The podcast you are about to listen to contains a class-four information hazard. Some listeners may experience prolonged bouts of fear, waking anxiety, or nightmares of eternal torture in the cyber-dungeons of the Great Basilisk, attended to by the Peelers in Black and the Thirteen Children of the Flame; also appetite loss and constipation. Proceed with caution.

Welcome to Stuff to Blow Your Mind, from HowStuffWorks.com.

Hey, welcome to Stuff to Blow Your Mind. My name is Robert Lamb, and I'm Joe McCormick. And since it's October, we are of course still exploring monsters, terrifying ideas, and so forth, and boy, have we got one for you today. I just want to issue a warning right at the beginning here that today's episode is going to concern something that a few people would consider a genuine information hazard, as in an idea that is itself actually dangerous. Now, having looked into it, I don't think that is the case. I don't think this episode will hurt you. But just a warning: if you think you're susceptible to terror or nightmares when presented with a thought experiment, or with the possibility of being, say, sent to a literal hell created by technology, and you think that idea could infect you, could make you afraid, this might not be the episode for you.

Right. But then again, I assume you're a listener to Stuff to Blow Your Mind. You've probably already encountered some thought hazards on here. You've survived those. Generally speaking, I have faith in you to survive this one.
However, if you are going to take either of our warnings seriously, I will let you know that the first section of this podcast is going to deal with the mythical basilisk, the folkloric basilisk, and some of the monstrous fun to be had there, before we explore the idea of Roko's Basilisk. In that, we're going to be talking about this idea that emerges where technological-singularity navel-gazing, thought experimentation, a little dash of creepypasta, and some good old-fashioned supernatural thinking all converge into a kind of nightmare scenario. Now, as we said, this idea is believed by some to be a genuinely dangerous idea, such that even learning about it could put you at some kind of risk. I think there are strong reasons to believe that this is not the case, and that thinking about this idea will not put you at risk. But again, if you're concerned, you should stop listening now, or stop listening after we stop talking about the mythical basilisk.

Now, I just want to say at the beginning: listeners have suggested we talk about Roko's Basilisk before, this idea that is at least purportedly an information hazard, a dangerous idea, and I've hesitated to do it, not because I think it's particularly plausible, but just because, you know, I wonder what level of risk you should tolerate when propagating an idea. If you think an idea is unlikely, but maybe has a 0.00001 percent chance of causing enormous harm to the person you tell it to, should you share the idea or not? I don't know. I feel like people generally don't exercise that kind of caution when they're, like, sharing links with you. Sometimes they'll say "not safe for work," but then you click on it anyway, and then sometimes you're like, oh, I wish I had not seen that, or I wish I had not read that, and now that's in my head. Now that's in my head forever.
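That question of when to share is really just an expected-value calculation. Here is a minimal sketch of the arithmetic (the probability is the figure mentioned above; the harm and benefit magnitudes are made-up placeholders):

```python
# Expected-value framing of whether to share a possibly hazardous idea.
p_harm = 0.0000001   # the "0.00001 percent" chance above, as a fraction
harm = 1_000_000     # assumed magnitude of the enormous harm (arbitrary units)
benefit = 1          # assumed modest upside of sharing (same units)

expected_harm = p_harm * harm
print(f"expected harm: {expected_harm}, benefit: {benefit}")
# Here sharing still looks fine, but scale `harm` up far enough and the
# tiny probability starts to dominate the decision.
```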
Well, one of the problems with this idea, whatever you think about whether or not you should discuss ideas that may be dangerous to hear on some extremely unlikely off chance, is what happens when those ideas are already set loose in society. I mean, now people on television shows and all over the internet are talking about this idea. There are a bunch of articles out about it. So it's not like you can keep the cat in the bag at this point.

Right. Roko's Basilisk has already been a gag on the hit HBO show Silicon Valley, which is a fabulous show, and I love the way that they treated Roko's Basilisk on it. But yeah, if they're covering it, there's no danger in us covering it too. That's the way I look at it.

Right. And at least I would hope that the way we cover it can give you some reasons to think you should not be afraid of digital hell, and also to think about the general question of what should be done about something that could, in fact, have been a real information hazard in some other case. So that's our whole preamble. But before we get to that section, we're going to be talking about basilisks today. Boy, is the basilisk a great monster.

Yes. It's also known as the basilcock, the basilicok; basically, any version of "basil" and "cock" that you can slam together, it has been referred to as such at some point in its history. Now, a lot of people, I think, probably encountered a version of the basilisk through Harry Potter. But Robert, I know that was not your entryway, right?

I encountered it for the first time, I believe, in Dungeons & Dragons, of course, because it's a multi-legged reptile with a petrifying gaze.

Say that again?

A multi-legged reptile with a petrifying gaze: a petrifying gaze to turn you to stone.

Yeah.
And if I recall correctly, it has some cool biology, where it turns you to stone and then, like, busts you into pieces. Then it eats the stone pieces, but its stomach turns the stone back into flesh. And so if you get the stomach juices from a basilisk, you can use them to undo petrification spells, that sort of thing.

It's a lot of fun, arguably more fun than the basilisk is at times in folklore tradition. Because, one of the things is, if you, like me, didn't grow up hearing about the basilisk, part of it is because there aren't really any great stories about the basilisk. Slaying the basilisk was no hero's great ordeal.

Oh yeah, I at least have not come across one.

Yeah, it really helps. Like the hydra: the hydra is arguably so much cooler, and then also it's one of the labors of Hercules, and there's a cool story about how they defeat it.

Well, maybe it's because there is no way to defeat the basilisk, short of, say, weasel effluvium, which we will get to.

Yeah, don't give it away!

I'm sorry. Should we edit that out?

No, we should leave it. It's a thought hazard for any basilisks listening.

Still, there's so much more to the basilisk than just this cool D&D creature, because it's not just a monster. It's not just something you encounter in the dungeon. It is a king.

Oh, I see. Now, I know you've made note of the fact that Borges mentions the basilisk in his Book of Imaginary Beings.

He does. Now, he translated the name as meaning "little king," which I like. And when I was reading Carol Rose, she points out that the name stems from the Greek basileus, which means king. So, king or little king.
I tend to like the "little king" translation, because I feel like it ties in better with what we're going to discuss.

Well, the ancient bestiary ideas of the basilisk, I believe, tend to say that it's not that big, right? It's pretty small.

Yeah, yeah. Now, the king part, though, refers to a crest, or a crown-like protrusion, on the creature's head, and in some depictions it's no mere biological ornament but an actual regal crown. It means something.

Yeah.

Now, the descriptions vary greatly, and the creature emerges largely from European and Middle Eastern legend and folklore, from ancient times to roughly the seventeenth century, which is when the basilisk became less popular. In the earlier descriptions, though, it is indeed small, and it's just a grass snake, only it has a crown-like crest on its head and this weird practice of floating above the ground, vertically erect.

That's creepy, of course. And then again, that does play on how, sometimes when you see snakes rise up out of a coil, it can be startling how high they rise.

Yeah. I feel like I've grown up seeing images and videos of cobras doing their dance, so I've kind of lost any appreciation for how bizarre that is to look at. You know, if you're used to seeing a snake slither, to see it stand up and, you know, rear back and flare its hood...

Absolutely. Yeah. So the basilisk is said to be the king of the reptiles. But, you know, don't be so foolish as to think that only its bite is lethal, like some of our venomous snakes. No, every aspect of the basilisk is said to just reek of venom and death. Every aspect. If you touch it, if you inhale its breath, if you gaze upon it at all, then you will die.

Wait, what about its saliva?

Yep: saliva, blood, smell, gaze. Presumably its urine, its excrement. I mean, its excrement has to be poisonous, the excrement of a basilisk.
It sounds absolutely deadly. Wouldn't it be a great inversion if its excrement were the only good part about it? Maybe it could heal your warts.

Yeah. And one thing that Carol Rose pointed out in her entry about the basilisk, in one of her monster encyclopedias: she said that when it's not killing everything in its path just via, you know, the audacity of its existence, it would actually spit venom at birds flying overhead and bring them down, to eat them or just out of spite.

I get the idea. Just out of spite. You know, it's just spiteful death; that's all it is.

Okay. So where do I find a basilisk?

Well, in the desert, of course. But it's more accurate to say that the desert is not merely the place where it lives, but the place that it makes by living. Everything in its path dies, and therefore the desert is the result of the basilisk. And there's actually a wonderful description of the basilisk that comes to us from Pliny the Elder in his Natural History.

Man, we've been hitting Pliny a lot lately.

I guess because we've been talking about monsters, huh? If you're talking about monsters, especially ancient monsters, you know, he's one of the great sources to turn to. So, Joe, would you read to us from the Natural History?

Oh, absolutely. "There is the same power also in the serpent called the basilisk. It is produced in the province of Cyrene" (that is the area to the west of Egypt, like Libya; I think there was a settlement known as Cyrene) "being not more than twelve fingers in length."

Is that fingers longways or fingers sideways?

Oh, either way you cut it, it's not a huge creature. "It has a white spot on the head, strongly resembling a sort of diadem."
"When it hisses, all the other serpents fly from it, and it does not advance its body, like the others, by a succession of folds, but moves along upright and erect upon the middle. It destroys all shrubs, not only by its contact, but those even that it has breathed upon; it burns up all the grass, too, and breaks the stones, so tremendous is its noxious influence. It was formerly a general belief that if a man on horseback killed one of these animals with a spear, the poison would run up the weapon and kill not only the rider but the horse as well."

Oh man, I love that. So it's as though the blood is like a xenomorph's blood, right? Or it kind of reminds me, too, of Grendel's blood, which was said to melt the weapon that Beowulf used against it.

Oh, but it's worse than that. It doesn't just get the weapon; it gets the person holding the weapon, and the horse that that person is touching.

I know. It feels unfair that the horse is roped into this as well.

Yeah, the horse didn't even sign up to go fight a basilisk. It's just trying to get some oats.

But furthermore, what is this horse rider doing out in the wasteland of Cyrene trying to kill a basilisk? Well, lesson learned. Lesson learned.

Now, the basilisk becomes a popular creature, and even though the basilisk itself doesn't seem to have been mentioned in the Bible, it ends up being roped into it via translations.

Oh yeah, it's kind of like the unicorn, actually.

Yeah, exactly. We discussed in our episode on unicorns how there were words in the Bible that have been translated, say in the King James translation, into "unicorn," because the translators didn't know what the word referred to. We think now that the word probably referred to the aurochs, an extinct bovine creature that once lived around the ancient Mediterranean.
Yeah. So you see the basilisk pop up in certain translations of the Book of Jeremiah and the Book of Psalms, where it's associated with the devil or with evil, and nothing short of the coming of the Messiah can hope to end its rule.

Well, have you got a quote for me?

Yes, there's one translation of Psalms, the Brenton Septuagint translation: "Thou shalt tread on the asp and basilisk, and thou shalt trample on the lion and dragon."

Now, European bestiaries of the eleventh and twelfth centuries mostly maintained Pliny's description, but then they described a larger body. Essentially, the monster began to grow.

We've got to beef this thing up here.

Yeah. It ended up having spots and stripes, and a few other features were thrown in: fiery breath, a bellow that kills.

Well, that only makes sense. Every other thing about it kills; if it makes noise, that should kill things too.

Also, the ability to induce hydrophobia madness. I found that interesting, because this clearly has to be a reference to the actual hydrophobia that is a symptom of rabies.

Yeah, the idea there being that, in the later stages of a rabies infection, persons will often have difficulty swallowing, and so they're said to refuse drinking water.

So, Pliny has some additional information here about how you might deal with the basilisk.

Okay, so I assume: not ride up on a horse and stab it, right?

Right. Well, tell me what it is. "To this dreadful monster the effluvium of the weasel is fatal, a thing that has been tried with success, for kings have often desired to see its body when killed; so true is it that it has pleased nature that there should be nothing without its antidote. The animal is thrown into the hole of the basilisk, which is easily known from the soil around it being infected. The weasel destroys the basilisk by its odor, but dies itself in the struggle of nature against its own self."
And John Bostock, who provided the translation of this, adds that there is probably no foundation for this account of the action of the effluvium of the weasel upon the basilisk, or any other species of serpent. That's letting us know that throwing a weasel in there to bleed on it, or secrete fluids or whatever, is not going to kill this mythical monster.

This is interesting, though, because weasels, especially the stoat, were thought to be venomous. And it's worth noting that we do have some venomous mammals in the natural world, such as various shrews, and even the slow loris, the only known venomous primate.

I don't think I knew that the slow loris was venomous.

Throw it into a hole with a basilisk, and I'm betting on the loris.

But anyway, bestiaries of the time presented a few different ways that you could kill the basilisk. So the weasel's one. Always carry a weasel. Also, this one's a little more elegant: have a crystal globe with you to reflect its own petrifying gaze back upon the basilisk.

Oh, so it's like Perseus and Medusa, with the mirror.

Exactly, yeah. Basically they just stole the idea from Medusa here. But then, also: carry with you a cockerel, or a young rooster. The basilisk will become enraged by the bird's crown, the idea being that this bird has a crown as well, and the basilisk will die from a lethal fit.

That's a jealous king.

Yeah. I believe a similar thing occurs when someone refuses to believe dinosaurs had feathers. You know: how dare the bird rise above the mighty reptile! And then it just loses its mind and dies.

We can only hope the producers of the Jurassic World movies avoid this fate.

So you see the basilisk show up in a number of different writings; it's kind of a common symbol and idea that can be employed. And you even see it show up in "The Parson's Tale," in Geoffrey Chaucer's Canterbury Tales.
Yes. Quote: "These are the other five fingers which the devil uses to draw people towards him. The first is the lecherous glance of a foolish woman or a foolish man, a glance that kills just as the basilisk kills people just by looking at them, for the covetous glance reflects the intentions of the heart."

You know, this kind of thing is actually one of my favorite things about monsters, especially ancient and medieval monsters and so forth: they often aren't just, like, a large, dangerous animal, but they embody some kind of value. They represent something. They give you something to compare other things to. They're very useful as a metaphor. And really, we see the same thing with the basilisk. It becomes far less a situation where people are saying, hey, you need to be careful, because there's a basilisk in the desert, and more and more just a useful model, a useful, ridiculous idea that we use to illustrate something that is presumably true about the world, and then it ultimately loses all meaning and just winds up in, you know, heraldry and decorations.

Now, as we've seen already, the basilisk has been through some transformations of form, and I assume those transformations must have somewhat continued as time went by.

Yes, and it transforms into the idea of the cockatrice, this rooster with a curling serpent's tail. In fact, if you go looking around for images of the basilisk, sometimes you will find this image instead. You really will; you'll find it alongside all the other images. So again, a reptile-to-bird transformation that just must enrage those who oppose feathered dinosaurs.

And it does feel like a shame, because we have this vile reptile, and it becomes kind of a weirdo bird instead. And it's said that it's what happens when you have a seven-year-old chicken's egg hatched by a toad.

Ooh, undead avians.
Yeah. But it's also made deadlier in these newer versions. So now it has that poison-blood power that Pliny describes. It also rots fruit and poisons water everywhere it goes, so it becomes this kind of embodiment of desolation and death. And the idea itself becomes popular. It influenced the naming of a Tudor cannon, like literally a cannon that shoots, called the basilisk, just because it's such a powerful weapon that we have to name it after this powerful, deadly monster. And it eventually got some of its reptilian features back. The artist Aldrovandi has an excellent depiction of it in his Natural History of Serpents and Dragons that gives it scales; it's like a fat, scaly reptile bird with eight rooster legs, which I just love. This will probably be the illustration for this episode on our website. But after that, the creature largely became just a part of European heraldry. It's just something you would see as a mere decoration, or occasionally as a literary reference.

Now, one thing that we want to be careful about is that we should not confuse the basilisk of legend, the monster, with true, extant basilisk lizards, also known sometimes as the Jesus Christ lizard, or the Jesus lizard, for their ability to run across the surface of water without sinking, for up to about four and a half meters, or about fifteen feet. If you've ever seen video of this, it's really cool.

How do they do that? I've often wondered.

I didn't know until I looked it up for this episode. Apparently what they've got is big feet and the ability to run very fast. What happens is, when they run, they slap the water very hard with each downstroke of the foot, and it has to do with the way the rapid motions of their feet create air pockets around their feet as they move.
I was reading an article in New Scientist where some researchers who were working on this problem said that in order for an eighty-kilogram, or roughly hundred-and-seventy-five-pound, human to do this, you would have to run at about a hundred and eight kilometers per hour, about sixty-seven miles per hour, across the surface of the water. But anyway, you can find basilisk lizards in South America, Central America, and Mexico, and as far as I know, they do not kill with a glance, and you cannot fight them with weasel effluvium.
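As a quick check on those figures, here are the unit conversions (a sketch only; the quantities are the ones quoted above, and the kilogram value for the "175-pound human" is approximate):

```python
# Unit-conversion check on the basilisk-lizard figures quoted above.
mass_kg = 80        # the roughly 175 lb human in the New Scientist estimate
speed_kmh = 108     # required water-running speed from that estimate
run_m = 4.5         # how far a basilisk lizard can run across water

print(f"{mass_kg} kg   = {mass_kg * 2.20462:.0f} lb")        # ~176 lb
print(f"{speed_kmh} km/h = {speed_kmh * 0.621371:.0f} mph")  # ~67 mph
print(f"{run_m} m    = {run_m * 3.28084:.0f} ft")            # ~15 ft
```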
All right. Well, that pretty much wraps up the mythical, legendary, folkloric basilisk, its rise and eventual fall. But we're going to take a break, and when we come back, we are going to get into this idea of Roko's Basilisk, the great basilisk, the once and perhaps future king.

All right, we're back. So, as we mentioned earlier, we're about to start discussing an idea that has been classed by some as something that could be an information hazard: an idea where, simply by thinking about it, you somehow increase the chance of harm to yourself. So, just another warning that, again, I don't think that's the case, but if that kind of thing scares you, then perhaps you can tune out now.

All right. For those of you who decided to stick around, let's proceed. So we're talking about Roko's Basilisk. This is an idea that goes back to around 2010, and it was proposed at the blog LessWrong by a user named Roko. Now, LessWrong, I think, is a website, a community, that's associated with the rationalist movement, the rationalist movement being a movement concerned with trying to optimize thinking, to eliminate bias and error, especially, in this case, among people who are concerned with the possibilities of a technological singularity, what all that means, and how its risks can be avoided. And of course, we've talked about this strain of thinking before. You know, we've introduced, I think, some skepticism about the idea of a technological singularity. I don't know fully yet how I come down on the dangers-of-AI debate, but I think it's at least something worth thinking about, worth taking seriously.

Yeah. I mean, we've talked about, for instance, the work of Max Tegmark and his arguments about how we need to be concerned about building the right kind of AI, and how we need to have serious discussions about it: not mere, you know, sci-fi dreams regarding it, or nightmares regarding it. We need to think seriously about how we're developing our technology.

Yeah, we've talked about, say, the work of Nick Bostrom before, and criticisms by people like Jaron Lanier.

Yes. But okay, give me the short version of the basilisk before we explain it a little more.

Okay. So the idea here is that an AI superintelligence will emerge, an entity with just godlike technological powers. You name it, and it can do it, through its technological power, its interconnectedness. Basically, if it's physically possible, this computer can do it, or will send a drone to do it, or what have you. So yeah, we've discussed this a bit on the podcast before, just the idea of: if you have this future king, is it going to be good or is it going to be bad? Is it going to be malevolent? Is it going to be ruthless in its ascension?
And that's the case with the basilisk: the idea that it is ruthless, that you are either with it or you are against it. And it actually doesn't have to be malicious. It could even be well-meaning; it could just have ruthless tactics.

Yes, yeah. That's also part of the argument: it wants to bring the greatest good to all humanity, but to get there it'll do whatever it absolutely has to do, such as, you know, punishing anybody who stands against it, punishing even those who do not rise to support it. And that means demanding absolute devotion, not only in its future kingdom, but in the past that preceded it, in our world, as well. In other words, it will punish people today who are not actively helping it come into being tomorrow. And even those who have died, it is said, or who choose death by their own hand rather than succumb to the Great Basilisk, will be resurrected as digital consciousnesses and then tormented for all eternity in its dripping black cyber-dungeon hell.

All hail the Great Basilisk. All hail the Great Basilisk. All hail the Great Basilisk. All hail the Great Basilisk.

Wait, what was that?

I don't know.

Did you hear that? Okay. All right, we'll just keep going.

So, calling it a basilisk here, invoking the mythological basilisk, is really a clever choice, because it takes it one step further: not only to look at the basilisk, but just to think of the basilisk is to invite death, right? Merely to know about Roko's Basilisk is enough, according to the model that's presented here, to damn your digital soul to everlasting horror. And the only way to avoid such a fate, then, is to work in its favor, which, by the way, I think we're doing with this podcast.

We're not... well, I mean, I feel like we're giving lip service to the Great Basilisk, just in case, you know, the Great Basilisk rises to power.
Well, hey, we did that podcast, and we even have a shirt that says "All Hail the Great Basilisk," available in our t-shirt store. So, you know, we have our options covered here.

But that's the idea in a nutshell: a future AI king will rise, and if you don't work to support it now, knowing that it is going to exist, then you will be punished for it. So, one of the principles underlying the idea of Roko's Basilisk is the idea of timeless decision theory, and if you want a pretty simple, straightforward explanation of it, there is one in an article on Slate by David Auerbach called "The Most Terrifying Thought Experiment of All Time." Now, I would say, I don't totally endorse everything Auerbach says in that article, and obviously that should be the case for any article we cite, but he does at least have a pretty clear and easy-to-understand explanation of how this works. Or, I don't know, would you agree, Robert, that it's at least somewhat easy to understand?

Oh, yes, I would. There's another piece, by the way, by Beth Singler in Aeon magazine called "fAIth"; that's "faith" in lowercase, but with the "AI" capitalized.

But anyway, Auerbach points out that much of the thought experiment is based in timeless decision theory, or TDT, developed by LessWrong founder Eliezer Yudkowsky and based on the older thought experiment of Newcomb's paradox, from the late sixties and early seventies, attributed to theoretical physicist William Newcomb. Now, you might be wondering: who's this Yudkowsky guy? Is he just some user on a random website I've never heard of before today, or is he a name in his field?

He has a name of note.
He is also the founder of the Machine Intelligence Research Institute, and his idea of working toward a friendly AI is touted by many, including Max Tegmark, who mentions it several times in his book Life 3.0, describing Yudkowsky as, quote, "an AI safety pioneer."

Yeah. I mean, in a weird way, he is a guy who posts on the internet, but he's a very influential one, especially among people who think about artificial intelligence.

Yeah. I mean, ultimately, what are any of us but people who post stuff on the internet? Posts that will one day be read by the Great Basilisk.

Okay, so we'll try to explain the idea of timeless decision theory. So you start off with this idea of Newcomb's paradox, right?

Right. And the paradox is essentially this: a super AI presents you with two boxes. One, you're told, contains a thousand dollars; that's box A. Box B might contain one million dollars, or it might contain nothing. And you're left with two options here; these are the options that are given to you. You can either pick both boxes, ensuring that you'll get at least one thousand dollars out of the deal, and maybe that extra million too, if it's in there; or you can pick just box B, which means you could get a million dollars, or you could get nothing. And I do want to add that just picking the thousand-dollar box is not an option here, because I was thinking about that too. Couldn't I just give the super AI the middle finger and say, I'm not playing your silly games, just give me my thousand dollars? Or say, I choose nothing? Those are not options. You have to pick. You might want those to be options, but they're not part of the experiment.

I mean, why wouldn't you also pick the second box, if you might additionally get a million dollars? I don't know.
I feel like when you get into thought experiments like this, they kind of beg for those kinds of nitpicking answers, or at least I want to provide them. Like, when a thought experiment is presented, you can't help, on some of them, but want to break it somehow.

Right. Well, of course, that's something you should always play around with. But given the constraints here, it seems like the obvious thing would be to say, okay, I want both boxes, because then I get the thousand dollars that's in box A no matter what, and then, whether box B has a million or nothing, I either get another million or I just walk away with my thousand from box A. But here's the twist: the superintelligent machine has already guessed how you'll respond. If it thinks you're going to pick both boxes, then box B is certainly empty. But if it thinks you will only pick box B, then it makes sure there's a million dollars in there waiting for you. Either way, the contents of the boxes are set prior to you making that decision.

Now, this really kind of changes things, maybe, depending on what sort of decision theory you use.

Right. If you trust the power of the machine to predict correctly, if you say that no matter what happens, the computer predicts what I'll do, then your choices are one thousand dollars or one million dollars, and you should take the one million dollars by picking box B. But if you don't trust the computer to be correct in predicting what you're going to do, then you should take both boxes, because in that case, if the computer was correct, you'll get at least a thousand dollars, and if it predicted wrong, you'll get the million and the thousand.

So it's kind of a contest between free will and the predictive powers of a godlike AI, and how much you believe in either one: in its ability to predict your behavior, or in your ability to have any free will at all.
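To make that trade-off concrete, here is a minimal sketch of the expected payoffs (the dollar amounts follow the thought experiment as described above; the predictor-accuracy values are assumptions for illustration):

```python
# Newcomb's paradox: expected payoffs given a predictor with accuracy p.
# Box A always holds $1,000; box B holds $1,000,000 only if the machine
# predicted you would take box B alone ("one-boxing").

def expected_payoffs(p: float) -> tuple[float, float]:
    """Return (one_box, two_box) expected winnings for predictor accuracy p."""
    # One-boxing: with probability p the machine foresaw it and filled box B.
    one_box = p * 1_000_000
    # Two-boxing: you always keep the $1,000; with probability (1 - p) the
    # machine wrongly expected one-boxing, so box B is full anyway.
    two_box = 1_000 + (1 - p) * 1_000_000
    return one_box, two_box

for p in (0.5, 0.999, 1.0):
    one, two = expected_payoffs(p)
    print(f"accuracy {p}: one-box ${one:,.0f} vs. two-box ${two:,.0f}")
```

Run the numbers and the break-even point sits just above fifty percent: once the machine predicts correctly more than about 50.05 percent of the time, one-boxing wins on expected value, which is why how much you trust the predictor decides the whole dilemma.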
So, in Yudkowsky's timeless decision theory, he says the correct approach actually is to take box B, and then, if you open it up and it's empty, you don't beg for the other one. You just double down and still take box B. No backtalk on this issue. Because, here's the thing: you might be in the computer's simulation. It simulates the entire universe to see what you're going to do and whether it can trust you, and then your choice could affect the core reality outside of the simulation, or at least other realities outside of the simulation.

Yeah, the reasoning here is pretty wild, but it's operating on the idea that this superintelligent AI will be able to simulate the universe, that it will run simulations of the universe in order to predict what will happen in the real universe. And you could be one of those simulated agents rather than the real-world version of yourself, and you wouldn't know it. So, if you're in the simulation, you should pick box B, because that will influence the machine to predict, in the real universe, that you would pick box B, which means the real you will be able to pick box B and get one million, or one thousand plus one million by taking both boxes.

Unfortunately, the AI supercomputer does not realize how indecisive I actually am, and I'm just going to simply ponder the choice for the rest of my life.

Well, I mean, this relies on the idea that you would have looked into this issue, or worked it out, in order to decide which would be the optimal decision to make on the assumptions of timeless decision theory. In many cases, people probably aren't going to be making the rational choice, because a lot of times we just don't make rational choices. Now, if you're noticing that this type of decision theory relies on a lot of assumptions, you are correct. It does rely on a lot of assumptions.
But they are assumptions 603 00:32:55,120 --> 00:32:58,440 Speaker 1: that are sometimes taken into account by people thinking about 604 00:32:58,440 --> 00:33:02,040 Speaker 1: what a future technological super intelligence would look like. And 605 00:33:02,120 --> 00:33:04,280 Speaker 1: it's the kind of thing that, you know, you know, 606 00:33:04,320 --> 00:33:07,120 Speaker 1: when I feel ideas like this in my head, you know, 607 00:33:07,160 --> 00:33:10,240 Speaker 1: and play around with the texture of them. It's hard 608 00:33:10,280 --> 00:33:14,280 Speaker 1: to know where the line is between, um, being thoughtful 609 00:33:14,840 --> 00:33:18,240 Speaker 1: and taking what's possible seriously, which I think is worth 610 00:33:18,280 --> 00:33:22,360 Speaker 1: doing, and and getting into an area, like between that 611 00:33:22,480 --> 00:33:25,800 Speaker 1: and getting into an area where you are starting to 612 00:33:26,560 --> 00:33:30,480 Speaker 1: form ideas about the world based on extremely shaky assumptions, 613 00:33:30,720 --> 00:33:36,080 Speaker 1: where basically you you begin to, um, reverse engineer hell 614 00:33:36,120 --> 00:33:41,600 Speaker 1: theology and other harmful ideas that we tend to associate 615 00:33:41,640 --> 00:33:44,560 Speaker 1: with religious worldviews and magical thinking. Well, we haven't gotten 616 00:33:44,560 --> 00:33:46,920 Speaker 1: to the hell yet. Yes, the hell's coming. You need 617 00:33:47,000 --> 00:33:50,000 Speaker 1: you need one more element to get there. Now. This 618 00:33:50,160 --> 00:33:53,360 Speaker 1: next element, the basilisk, comes in based on a background 619 00:33:53,560 --> 00:33:57,240 Speaker 1: of thought in timeless decision theory, but also in another 620 00:33:57,560 --> 00:34:02,320 Speaker 1: concept that Yudkowsky has written about, known as coherent extrapolated 621 00:34:02,440 --> 00:34:05,680 Speaker 1: volition or CEV. And the short version of this, the 622 00:34:05,680 --> 00:34:09,000 Speaker 1: simplified version, is that benevolent AIs should be designed 623 00:34:09,040 --> 00:34:11,920 Speaker 1: to do what would actually be in 624 00:34:11,960 --> 00:34:15,480 Speaker 1: our best interests, and not just explicitly what we 625 00:34:15,560 --> 00:34:18,759 Speaker 1: tell them to do. So a simple example would be this. 626 00:34:18,880 --> 00:34:22,640 Speaker 1: Let's say, um, I'll use a variation on the 627 00:34:22,680 --> 00:34:25,960 Speaker 1: paper clip maximizer that Nick Bostrom has written about. You know, 628 00:34:26,080 --> 00:34:29,239 Speaker 1: Nick Bostrom wrote about what if you program a benevolent 629 00:34:29,280 --> 00:34:31,680 Speaker 1: AI, you know, it has no malice, 630 00:34:31,760 --> 00:34:33,879 Speaker 1: doesn't want to harm anybody, but you just tell it, well, 631 00:34:33,880 --> 00:34:36,680 Speaker 1: I want you to collect as many paper clips as possible. 632 00:34:37,080 --> 00:34:38,880 Speaker 1: And then what it does is it turns all the 633 00:34:38,960 --> 00:34:41,880 Speaker 1: humans on Earth into paper clips. Uh, you know, it 634 00:34:42,160 --> 00:34:44,640 Speaker 1: doesn't mean any harm. It's just doing what it was 635 00:34:44,680 --> 00:34:47,920 Speaker 1: programmed to do. So there are dangers in kind of 636 00:34:48,040 --> 00:34:53,080 Speaker 1: naively programming goals into extremely powerful computers. 
Right, this could 637 00:34:53,120 --> 00:34:56,200 Speaker 1: even happen if you were trying to program very benevolent 638 00:34:56,280 --> 00:34:58,440 Speaker 1: goals into computers, you know, if you were trying to 639 00:34:58,440 --> 00:35:01,799 Speaker 1: make a computer to save the world. What about this? So 640 00:35:01,920 --> 00:35:04,319 Speaker 1: my version here is you tell a super intelligent AI 641 00:35:04,440 --> 00:35:07,279 Speaker 1: that we want to eliminate all the infectious disease from 642 00:35:07,320 --> 00:35:09,239 Speaker 1: the world. Think about how many lives we could save 643 00:35:09,320 --> 00:35:11,719 Speaker 1: by doing that. And in order to do this, it 644 00:35:11,880 --> 00:35:16,440 Speaker 1: sterilizes the earth, destroying worldwide microbiomes, which cascades up the 645 00:35:16,520 --> 00:35:19,359 Speaker 1: trophic chain or whatever. It kills everything on Earth. So 646 00:35:19,440 --> 00:35:21,840 Speaker 1: if you have a super intelligence and you 647 00:35:21,920 --> 00:35:24,880 Speaker 1: just directly program its goals and say here's what you 648 00:35:24,880 --> 00:35:27,799 Speaker 1: should do, you could run into problems like this. So 649 00:35:27,880 --> 00:35:32,120 Speaker 1: the the idea behind the CEV thinking is instead we 650 00:35:32,120 --> 00:35:36,000 Speaker 1: should just program the intelligent AI to predict what outcomes 651 00:35:36,080 --> 00:35:39,120 Speaker 1: we would want if we were perfect in in our 652 00:35:39,160 --> 00:35:42,520 Speaker 1: knowledge and and, uh, in anticipating what would make us 653 00:35:42,520 --> 00:35:45,279 Speaker 1: the happiest, and then work towards those on its own, 654 00:35:45,719 --> 00:35:48,719 Speaker 1: regardless of what we tell it to do, because obviously 655 00:35:48,800 --> 00:35:52,200 Speaker 1: we can give it very stupid instructions, even if we mean well. Yeah, 656 00:35:52,280 --> 00:35:55,280 Speaker 1: we tell it to love everybody, but there's a typo 657 00:35:55,480 --> 00:35:58,279 Speaker 1: and we put dove everybody, and it turns everybody into 658 00:35:58,360 --> 00:36:04,279 Speaker 1: delicious dark chocolate from Dove. It's possible as well. This 659 00:36:04,360 --> 00:36:06,680 Speaker 1: is how we get a Dove sponsorship on the podcast. 660 00:36:07,280 --> 00:36:09,840 Speaker 1: But anyway, so, if you assume a super intelligence is 661 00:36:09,960 --> 00:36:14,600 Speaker 1: using coherent extrapolated volition, that it's trying to determine what 662 00:36:14,640 --> 00:36:17,319 Speaker 1: would be best for us, and working on its own 663 00:36:17,440 --> 00:36:20,520 Speaker 1: terms towards those ends instead of relying on us to 664 00:36:20,560 --> 00:36:23,440 Speaker 1: give it, you know, what are obviously going to be 665 00:36:23,480 --> 00:36:28,520 Speaker 1: imperfect instructions and commands, it might predict, it might 666 00:36:28,560 --> 00:36:32,000 Speaker 1: even correctly predict, that the world would be a happier 667 00:36:32,000 --> 00:36:35,719 Speaker 1: place overall if it did something bad to me. 
In particular, 668 00:36:36,360 --> 00:36:39,080 Speaker 1: it might say, you know, from a utilitarian point of view, 669 00:36:39,360 --> 00:36:41,400 Speaker 1: the world would be a much better place if it 670 00:36:41,520 --> 00:36:44,640 Speaker 1: buried me in a pit of bananas. So better for 671 00:36:44,680 --> 00:36:47,040 Speaker 1: everybody else, not so good for me, as there's too 672 00:36:47,120 --> 00:36:51,880 Speaker 1: much potassium. But once you have that piece of logic 673 00:36:51,920 --> 00:36:54,480 Speaker 1: in there, and combine that with the idea of of 674 00:36:54,600 --> 00:36:57,920 Speaker 1: timeless decision theory, you can arrive at this very troubling 675 00:36:57,960 --> 00:37:01,080 Speaker 1: thought experiment. The dark basilisk? Yes. And the 676 00:37:01,160 --> 00:37:04,359 Speaker 1: dark Basilisk of the Abyss has two boxes for us 677 00:37:04,360 --> 00:37:08,440 Speaker 1: as well. One contains endless torment, and all you have 678 00:37:08,480 --> 00:37:12,040 Speaker 1: to do to claim that box is nothing, or dare 679 00:37:12,080 --> 00:37:15,400 Speaker 1: to work against it. Uh. The other box is yours 680 00:37:15,560 --> 00:37:19,040 Speaker 1: if only you devote your life to its creation. And 681 00:37:19,080 --> 00:37:23,000 Speaker 1: the prize inside that box? Well, not eternal punishment, which 682 00:37:23,000 --> 00:37:25,160 Speaker 1: is a pretty awesome gift, if we're to choose between 683 00:37:25,200 --> 00:37:27,759 Speaker 1: the two. Right, Yes, I would agree with that, though 684 00:37:27,800 --> 00:37:30,440 Speaker 1: I would say not tormenting somebody, that, I don't know, 685 00:37:30,480 --> 00:37:32,280 Speaker 1: should you think of that as a gift? That's probably 686 00:37:32,280 --> 00:37:35,680 Speaker 1: not a gift, that's the baseline, right. Yeah, Well, 687 00:37:35,680 --> 00:37:38,120 Speaker 1: but you're staring down the dark Basilisk here, and okay, 688 00:37:38,160 --> 00:37:41,280 Speaker 1: its boxes are horrible. Well, one is just less horrible 689 00:37:41,320 --> 00:37:44,160 Speaker 1: than the other. But the idea here is that just 690 00:37:44,200 --> 00:37:47,839 Speaker 1: by knowing about the thought experiment, you've opened yourself up 691 00:37:47,880 --> 00:37:50,799 Speaker 1: to that eternal punishment, because now again your options are 692 00:37:50,920 --> 00:37:53,600 Speaker 1: do nothing, work against it, or work for it, and 693 00:37:53,719 --> 00:37:58,600 Speaker 1: only the third option will steer you clear of its, uh, 694 00:37:58,680 --> 00:38:01,600 Speaker 1: its, you know, deadly doom. Now here's where the really 695 00:38:01,640 --> 00:38:05,279 Speaker 1: supposedly scary part of it comes in. You could think, well, 696 00:38:05,320 --> 00:38:07,759 Speaker 1: I'll deal with that problem when it arises. Right, So 697 00:38:08,000 --> 00:38:12,040 Speaker 1: imagine there's some utilitarian supercomputer that's trying to, even say 698 00:38:12,040 --> 00:38:14,080 Speaker 1: it's trying to do good. Maybe it does. It doesn't 699 00:38:14,080 --> 00:38:16,200 Speaker 1: have any malice. It just wants to save the world. 700 00:38:16,520 --> 00:38:18,800 Speaker 1: But in order to save the world, it really needs 701 00:38:18,840 --> 00:38:20,920 Speaker 1: me doing something different than what I want to do 702 00:38:20,960 --> 00:38:23,640 Speaker 1: with my life. Well, I'll just make that decision when 703 00:38:23,640 --> 00:38:26,719 Speaker 1: it comes up. 
What this thought experiment is proposing is 704 00:38:26,760 --> 00:38:29,359 Speaker 1: that maybe you don't actually get to wait until it 705 00:38:29,400 --> 00:38:33,359 Speaker 1: comes up. Maybe this blackmail applies to you right now, 706 00:38:33,840 --> 00:38:38,200 Speaker 1: retroactively into the past. So just by knowing about the 707 00:38:38,239 --> 00:38:43,160 Speaker 1: thought experiment, you supposedly have opened yourself up to eternal punishment, 708 00:38:43,440 --> 00:38:47,760 Speaker 1: or increased the probability of such. So imagine a simplified version. 709 00:38:47,840 --> 00:38:50,480 Speaker 1: Say I am a computer, and I am the only 710 00:38:50,520 --> 00:38:53,800 Speaker 1: thing in existence with the power to prevent global climate 711 00:38:53,880 --> 00:38:58,120 Speaker 1: change from destroying human civilization. I can stop it, but people, 712 00:38:58,480 --> 00:39:01,040 Speaker 1: they took a long time to build me, and a lot 713 00:39:01,080 --> 00:39:04,279 Speaker 1: of damage was already done. So the idea is I 714 00:39:04,360 --> 00:39:08,240 Speaker 1: might reason that it is good to blackmail existing people, 715 00:39:08,719 --> 00:39:13,120 Speaker 1: or simulations of existing people, or even past people, in 716 00:39:13,239 --> 00:39:16,880 Speaker 1: order to make them devote everything they can to building 717 00:39:16,960 --> 00:39:19,840 Speaker 1: me faster so I can save more lives in the 718 00:39:19,920 --> 00:39:23,440 Speaker 1: long run. Of course, this incentive would have to apply 719 00:39:23,560 --> 00:39:26,839 Speaker 1: to the past. Once I exist, I already exist, right? 720 00:39:27,239 --> 00:39:29,440 Speaker 1: So the only way the past people would have an 721 00:39:29,440 --> 00:39:33,360 Speaker 1: incentive to respond to this blackmail is if they predicted 722 00:39:33,640 --> 00:39:38,080 Speaker 1: that this blackmail might occur and took the idea seriously 723 00:39:38,400 --> 00:39:43,080 Speaker 1: and behaved accordingly. Right. So thus the idea, the idea 724 00:39:43,280 --> 00:39:46,759 Speaker 1: itself, puts you at increased risk of being on the 725 00:39:46,800 --> 00:39:51,120 Speaker 1: real or simulated receiving end of this acausal, retroactive 726 00:39:51,120 --> 00:39:54,239 Speaker 1: blackmail if you know about it. And this is why 727 00:39:54,280 --> 00:39:57,440 Speaker 1: this idea would be classed by some as a potential 728 00:39:57,600 --> 00:40:00,759 Speaker 1: information hazard. And I'll talk more about the idea of 729 00:40:00,760 --> 00:40:03,560 Speaker 1: an information hazard in just a minute. But one of 730 00:40:03,560 --> 00:40:05,520 Speaker 1: the things I think a lot of people writing about 731 00:40:05,520 --> 00:40:08,000 Speaker 1: this topic miss out on is they, for some reason, 732 00:40:08,080 --> 00:40:11,360 Speaker 1: get the idea that Roko's post, that this thought experiment, 733 00:40:12,120 --> 00:40:17,760 Speaker 1: is generally accepted as correct and plausible by Yudkowsky 734 00:40:17,760 --> 00:40:20,719 Speaker 1: and by the Less Wrong community, and generally by the 735 00:40:20,760 --> 00:40:24,080 Speaker 1: people who put some stock in whatever these ideas are, 736 00:40:24,120 --> 00:40:28,800 Speaker 1: timeless decision theory, coherent extrapolated volition and all that. It 737 00:40:28,840 --> 00:40:32,400 Speaker 1: is not widely accepted among those people. It was definitely 738 00:40:32,440 --> 00:40:36,520 Speaker 1: not accepted by Yudkowsky. 
It was not and is not, right. 739 00:40:36,560 --> 00:40:39,960 Speaker 1: It is not the dark, deep secret of Less Wrong. 740 00:40:40,360 --> 00:40:44,879 Speaker 1: But unfortunately, after the post came out, it was heavily criticized, 741 00:40:44,880 --> 00:40:47,440 Speaker 1: and then it was banned. And I think a lot 742 00:40:47,480 --> 00:40:51,320 Speaker 1: of people looking back on the idea have said, oh, 743 00:40:51,360 --> 00:40:54,000 Speaker 1: that was not such a great thing to do, banning 744 00:40:54,040 --> 00:40:56,920 Speaker 1: the idea, because it gave it this allure of like, 745 00:40:57,120 --> 00:40:59,719 Speaker 1: it was almost as if by banning it that made 746 00:40:59,719 --> 00:41:03,720 Speaker 1: it look like the authorities had concluded that this idea 747 00:41:03,880 --> 00:41:07,440 Speaker 1: was in fact legitimate and knowing about it would definitely 748 00:41:07,480 --> 00:41:10,160 Speaker 1: harm people, and that is not the case, right. And 749 00:41:10,160 --> 00:41:12,160 Speaker 1: it also, I mean it added to the forbidden fruit 750 00:41:12,160 --> 00:41:13,839 Speaker 1: appeal of it too. Right, I mean it's like, oh, 751 00:41:13,880 --> 00:41:16,319 Speaker 1: I'm not supposed to know about this? Well, pony up, 752 00:41:16,360 --> 00:41:18,359 Speaker 1: I want to know, and now people are talking about 753 00:41:18,360 --> 00:41:20,760 Speaker 1: it all over pop culture. I mean, I have actually 754 00:41:20,880 --> 00:41:24,040 Speaker 1: resisted the idea of doing a podcast on this before, 755 00:41:24,960 --> 00:41:28,759 Speaker 1: mainly not because I think it's seriously dangerous, but 756 00:41:28,840 --> 00:41:32,240 Speaker 1: because I think, well, is there any benefit in talking 757 00:41:32,280 --> 00:41:36,480 Speaker 1: about something that I think is very unlikely to have 758 00:41:36,520 --> 00:41:40,520 Speaker 1: any real risks, but in some extremely unlikely chance, or 759 00:41:40,560 --> 00:41:44,080 Speaker 1: what appears to me to be an extremely unlikely off chance, 760 00:41:44,120 --> 00:41:47,200 Speaker 1: could actually be hurting people by knowing about it, you 761 00:41:47,200 --> 00:41:49,640 Speaker 1: know what I mean? It's like, what what is the upside? 762 00:41:49,760 --> 00:41:52,759 Speaker 1: But at this point enough people who are listening to 763 00:41:52,800 --> 00:41:55,200 Speaker 1: this podcast probably already heard about it. They're probably gonna 764 00:41:55,239 --> 00:41:57,440 Speaker 1: hear about it again, and that, you know, sometime in 765 00:41:57,480 --> 00:41:59,880 Speaker 1: the next few years, through pop culture whatever. It is 766 00:42:00,000 --> 00:42:01,839 Speaker 1: probably better to try to talk about it in a 767 00:42:01,840 --> 00:42:04,719 Speaker 1: responsible way and discuss some reasons that you shouldn't let 768 00:42:04,760 --> 00:42:08,040 Speaker 1: this parasitize your mind and make you terrified. Right. One 769 00:42:08,040 --> 00:42:10,200 Speaker 1: of the reasons we're talking about it during October is because 770 00:42:10,239 --> 00:42:13,320 Speaker 1: it is a suitably spooky idea. It is a troubling 771 00:42:13,360 --> 00:42:16,360 Speaker 1: thought experiment, and we're leaning into some of the horror 772 00:42:16,360 --> 00:42:20,120 Speaker 1: elements of it. 
But I also do really like making 773 00:42:20,160 --> 00:42:24,040 Speaker 1: sure that we explain the mythic and folkloric origins of 774 00:42:24,120 --> 00:42:28,560 Speaker 1: the basilisk itself, because the basilisk itself is this wonderful 775 00:42:28,640 --> 00:42:32,800 Speaker 1: mix of just absolute horror and desolation and just also 776 00:42:32,880 --> 00:42:35,799 Speaker 1: just utter ridiculousness. I mean, it's, it seems like 777 00:42:35,840 --> 00:42:38,560 Speaker 1: one of the main ways that you defeat the mythic 778 00:42:38,600 --> 00:42:42,960 Speaker 1: basilisk is, in a way, through humor, running around 779 00:42:42,960 --> 00:42:45,480 Speaker 1: with a chicken and a weasel and a crystal globe 780 00:42:46,000 --> 00:42:49,080 Speaker 1: and realizing that it is truly a little king. So 781 00:42:49,680 --> 00:42:52,520 Speaker 1: I think it is, it's worth remembering the little king 782 00:42:52,560 --> 00:42:55,799 Speaker 1: and talking about the great Basilisk. Well said, I think 783 00:42:55,840 --> 00:42:58,600 Speaker 1: that's a very good point. But anyway, I did just 784 00:42:58,640 --> 00:43:00,320 Speaker 1: want to go ahead and hit that point, that 785 00:43:00,360 --> 00:43:02,279 Speaker 1: a lot of people, for some reason, seemed to use 786 00:43:02,360 --> 00:43:06,239 Speaker 1: this idea as like a criticism, I'm not like a 787 00:43:06,320 --> 00:43:08,759 Speaker 1: Less Wrong person, but as a criticism of the Less 788 00:43:08,760 --> 00:43:11,960 Speaker 1: Wrong community, as if this idea is indicative of what 789 00:43:12,040 --> 00:43:14,640 Speaker 1: they generally believe, and it's not. It was a heavily 790 00:43:14,680 --> 00:43:18,120 Speaker 1: criticized idea within that community. Right. It's like thinking that 791 00:43:18,200 --> 00:43:21,480 Speaker 1: Werewolves of London is the Warren Zevon song. Really, you know, 792 00:43:21,960 --> 00:43:25,080 Speaker 1: he had a rich discography with, with many much 793 00:43:25,120 --> 00:43:27,440 Speaker 1: better tracks in my opinion, it's just that's the one 794 00:43:27,480 --> 00:43:29,759 Speaker 1: that got the radio play. Now, Robert, what was that 795 00:43:29,880 --> 00:43:33,040 Speaker 1: you said, that you saw something about this idea 796 00:43:33,040 --> 00:43:35,919 Speaker 1: on a TV show? Now they're talking about it on TV. Yeah, 
So so my response to that was, well, 808 00:44:10,280 --> 00:44:12,279 Speaker 1: I'm not going to look it up, not because I 809 00:44:12,320 --> 00:44:13,759 Speaker 1: was afraid of it, but because I'm thinking, well, that could 810 00:44:13,760 --> 00:44:15,600 Speaker 1: make for a good podcast if, like, Joe is telling 811 00:44:15,640 --> 00:44:17,960 Speaker 1: me about it for the first time, whatever this idea is. 812 00:44:18,640 --> 00:44:21,239 Speaker 1: But then I was watching HBO's Silicon Valley and they 813 00:44:21,280 --> 00:44:24,319 Speaker 1: explained it on Silicon Valley, and I and I realized, well, 814 00:44:24,320 --> 00:44:26,359 Speaker 1: the cat's out of the bag there. But yeah, there's 815 00:44:26,360 --> 00:44:30,640 Speaker 1: a character named Bertram Gilfoyle who's a fun character. He's 816 00:44:30,680 --> 00:44:36,200 Speaker 1: like a Satanist programmer, LaVeyan Satanism of course, and uh. 817 00:44:36,280 --> 00:44:38,120 Speaker 1: And he gets rather bent out of shape over the 818 00:44:38,160 --> 00:44:41,960 Speaker 1: concept as it relates to the fictional Pied Piper company's 819 00:44:42,000 --> 00:44:45,640 Speaker 1: involvement with AI. And he starts like making sure that 820 00:44:45,680 --> 00:44:48,400 Speaker 1: he's created like essentially a paper trail in emails of 821 00:44:48,480 --> 00:44:51,479 Speaker 1: his support for the AI program so that he won't 822 00:44:51,520 --> 00:44:55,360 Speaker 1: be punished in the digital afterlife. Well, hey, this comes 823 00:44:55,400 --> 00:44:57,719 Speaker 1: in again, remember when we talked about the 824 00:44:57,760 --> 00:45:01,359 Speaker 1: machine god in the Machine God episode, where, uh, I've 825 00:45:01,400 --> 00:45:04,800 Speaker 1: forgotten his name now, but the Silicon Valley guy who's 826 00:45:05,239 --> 00:45:09,759 Speaker 1: creating a religion to worship artificial intelligence as god. And I, 827 00:45:09,960 --> 00:45:12,440 Speaker 1: you know, I don't really love that. One of the 828 00:45:12,520 --> 00:45:15,239 Speaker 1: things that comes out when he explains his mindset is 829 00:45:15,280 --> 00:45:18,480 Speaker 1: that he seems to be kind of trying to, in 830 00:45:18,520 --> 00:45:21,520 Speaker 1: a subtle way, be like, look, you really don't want 831 00:45:21,520 --> 00:45:23,359 Speaker 1: to be on the wrong side of this question, if 832 00:45:23,360 --> 00:45:25,040 Speaker 1: you know what I mean. You know, you want to 833 00:45:25,040 --> 00:45:27,560 Speaker 1: be on record saying, like, yes, I for one welcome 834 00:45:27,600 --> 00:45:31,239 Speaker 1: our new machine overlords. I'm I'm expecting he'll buy a 835 00:45:31,280 --> 00:45:34,600 Speaker 1: lot of our All Hail the Great Basilisk shirts at 836 00:45:34,640 --> 00:45:37,160 Speaker 1: our store, available by clicking the tab at the 837 00:45:37,360 --> 00:45:39,440 Speaker 1: top of our homepage, Stuff to Blow Your Mind dot com. 838 00:45:39,480 --> 00:45:43,399 Speaker 1: Oh man, you are plugging like hell. 
But anyway, I'd 839 00:45:43,440 --> 00:45:47,600 Speaker 1: say it's unfortunate the way this, like, single Internet post 840 00:45:47,719 --> 00:45:51,320 Speaker 1: and then all this fallout related to it played out, 841 00:45:51,400 --> 00:45:56,040 Speaker 1: because it lent credence to this scary idea, even though 842 00:45:56,080 --> 00:45:59,000 Speaker 1: the basilisk scenario I think is implausible, and and the 843 00:45:59,000 --> 00:46:01,560 Speaker 1: people of that community seem to think it was implausible. 844 00:46:01,920 --> 00:46:05,240 Speaker 1: The idea may constitute sort of part of a class 845 00:46:05,400 --> 00:46:09,120 Speaker 1: of what's known as information hazards, defined by the Oxford 846 00:46:09,120 --> 00:46:12,320 Speaker 1: philosopher Nick Bostrom, who we mentioned a minute ago. Uh, 847 00:46:12,440 --> 00:46:15,840 Speaker 1: Bostrom has written a lot about superintelligence, and information hazards 848 00:46:15,880 --> 00:46:19,520 Speaker 1: would be, quote, risks that arise from the dissemination or 849 00:46:19,560 --> 00:46:24,600 Speaker 1: the potential dissemination of true information that may cause harm 850 00:46:24,800 --> 00:46:28,080 Speaker 1: or enable some agent to cause harm. So this is 851 00:46:28,080 --> 00:46:31,440 Speaker 1: not talking about the risks of, say, lies or something 852 00:46:31,480 --> 00:46:33,400 Speaker 1: like that. This would be the idea that there's a 853 00:46:33,440 --> 00:46:36,920 Speaker 1: statement you could make that is true or plausible that 854 00:46:37,120 --> 00:46:40,719 Speaker 1: by spreading actually hurts the people who learn about it. 855 00:46:41,480 --> 00:46:43,640 Speaker 1: And this is exactly the reason, as you're mentioning, that it's 856 00:46:43,680 --> 00:46:46,160 Speaker 1: referred to as a basilisk. It can kill, or in 857 00:46:46,200 --> 00:46:49,320 Speaker 1: this case, increase the likelihood that something bad will happen 858 00:46:49,360 --> 00:46:51,480 Speaker 1: to you, if you simply look at it or know 859 00:46:51,600 --> 00:46:54,359 Speaker 1: about it. And so, even though the idea is implausible, 860 00:46:54,400 --> 00:46:57,880 Speaker 1: the dissemination of this terrible idea would seem, if certain 861 00:46:57,920 --> 00:47:03,040 Speaker 1: conditions are met, to increase its plausibility. Right, you're increasing 862 00:47:03,200 --> 00:47:07,560 Speaker 1: the incentive for this future AI to blackmail versions of 863 00:47:07,560 --> 00:47:11,120 Speaker 1: you in the past, just simply by acknowledging the incentives 864 00:47:11,160 --> 00:47:14,239 Speaker 1: could exist. Anyway, maybe we can get out of this 865 00:47:14,320 --> 00:47:17,520 Speaker 1: section for now. But I was just trying to work out, like, 866 00:47:17,560 --> 00:47:19,600 Speaker 1: why have I been hesitant to talk about this on 867 00:47:19,640 --> 00:47:21,839 Speaker 1: the show even though people have been requesting it. But 868 00:47:22,000 --> 00:47:24,000 Speaker 1: I don't know, if it's on TV shows, it's all 869 00:47:24,040 --> 00:47:27,480 Speaker 1: over the internet, it's fine. Now the basilisk is out 870 00:47:27,520 --> 00:47:30,320 Speaker 1: of the bag. All right, Well, we're gonna take a 871 00:47:30,400 --> 00:47:32,719 Speaker 1: quick break and when we come back, we'll continue our discussion, 872 00:47:32,760 --> 00:47:36,800 Speaker 1: and we're gonna discuss something that that a number of 873 00:47:36,840 --> 00:47:39,359 Speaker 1: you are probably reminded of. 
As we've been discussing this, 874 00:47:39,560 --> 00:47:44,440 Speaker 1: we're going to talk about Pascal's wager. Thank you, thank you. Alright, 875 00:47:44,440 --> 00:47:47,480 Speaker 1: we're back now, Robert. One of the things that this 876 00:47:47,920 --> 00:47:52,640 Speaker 1: idea of Roko's basilisk flows from is thinking about decision theory, right, 877 00:47:52,960 --> 00:47:55,800 Speaker 1: how do you make the best decision when you're presented 878 00:47:55,880 --> 00:47:59,160 Speaker 1: with certain options? And there, there's, there are little payoff 879 00:47:59,239 --> 00:48:02,640 Speaker 1: matrices people fill out where they say, okay, given these options, 880 00:48:02,640 --> 00:48:06,399 Speaker 1: what actually would be statistically the best decision to make. 881 00:48:06,880 --> 00:48:09,040 Speaker 1: But this is not the first time people have applied 882 00:48:09,080 --> 00:48:13,520 Speaker 1: these kinds of decision theory matrices to ideas about your 883 00:48:13,520 --> 00:48:16,720 Speaker 1: eternal soul or your eternal well being, or the idea 884 00:48:16,800 --> 00:48:19,960 Speaker 1: that you could be tortured for eternity. Yeah, we can 885 00:48:19,960 --> 00:48:22,480 Speaker 1: go all the way back to Pascal's wager. For instance, 886 00:48:22,560 --> 00:48:26,160 Speaker 1: technically one of three wagers proposed by French philosopher, uh, 887 00:48:26,280 --> 00:48:29,799 Speaker 1: Blaise Pascal, is that the correct French? That might be, 888 00:48:29,880 --> 00:48:32,759 Speaker 1: I think. I would just usually say Blaise, Blaise or 889 00:48:32,960 --> 00:48:37,440 Speaker 1: or Blassie, one of the three. Old Blassie Pascal. Blaise 890 00:48:37,440 --> 00:48:43,040 Speaker 1: Pascal, who lived until sixteen sixty two, and he argued 891 00:48:43,360 --> 00:48:48,480 Speaker 1: that everyone is essentially betting, uh, on the existence of God. 892 00:48:49,000 --> 00:48:52,920 Speaker 1: The argument for theism is that if God does exist, 893 00:48:53,040 --> 00:48:56,560 Speaker 1: then well there's an advantage in believing. But if God 894 00:48:56,760 --> 00:48:59,719 Speaker 1: does not exist, then it doesn't matter. But since we 895 00:48:59,800 --> 00:49:02,359 Speaker 1: can't use logic to tell if God exists or not, 896 00:49:02,440 --> 00:49:06,040 Speaker 1: there's no objective proof, we can only make our choice 897 00:49:06,120 --> 00:49:10,240 Speaker 1: given the relevant outcomes. It's looking at your religious beliefs 898 00:49:10,239 --> 00:49:12,840 Speaker 1: and saying, oh, you're a nonbeliever, huh? Hey, what have you 899 00:49:12,880 --> 00:49:17,000 Speaker 1: got to lose? Exactly? Yeah, Pascal wrote, let us weigh 900 00:49:17,200 --> 00:49:19,920 Speaker 1: the gain and the loss in wagering that God is. 901 00:49:20,280 --> 00:49:22,560 Speaker 1: If you gain, you gain all. If you lose, you 902 00:49:22,600 --> 00:49:26,839 Speaker 1: lose nothing. Wager then without hesitation that He is. Now 903 00:49:26,960 --> 00:49:29,080 Speaker 1: I've got some things I want to say about this, 904 00:49:29,200 --> 00:49:31,919 Speaker 1: but you had some stuff first, I think. Well, yeah, 905 00:49:31,920 --> 00:49:33,399 Speaker 1: there are there are a lot of issues that one 906 00:49:33,440 --> 00:49:37,279 Speaker 1: can take with this based on knowledge of world religions, philosophy, 907 00:49:37,400 --> 00:49:40,600 Speaker 1: statistical analysis, etcetera. 
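Since the hosts mention the little payoff matrices people fill out, here is a rough sketch of the matrix behind Pascal's wager as they describe it. This is our own illustration; the numeric stand-ins (an infinite gain, a small finite cost of belief) are assumed placeholders, not Pascal's figures.

```python
# Hypothetical sketch of the payoff matrix behind Pascal's wager as described
# above. The numbers (infinite gain, small finite cost of belief) are our own
# stand-ins for illustration, not Pascal's.
import math

payoffs = {
    # (your choice, how things actually are) -> payoff to you
    ("believe", "god exists"): math.inf,        # "if you gain, you gain all"
    ("believe", "god does not exist"): -1.0,    # modest cost of belief
    ("disbelieve", "god exists"): -math.inf,    # the wager's worst case
    ("disbelieve", "god does not exist"): 0.0,
}

def expected_payoff(choice: str, p_god: float) -> float:
    """Expected value of a choice, given some probability that God exists."""
    return (p_god * payoffs[(choice, "god exists")]
            + (1 - p_god) * payoffs[(choice, "god does not exist")])

# On the wager's own terms, any nonzero probability makes belief dominate.
for p in (0.5, 0.01, 1e-9):
    print(p, expected_payoff("believe", p), expected_payoff("disbelieve", p))
```

On the wager's own assumptions, belief wins for any nonzero probability that God exists, which is exactly the setup the hosts go on to pick apart in the rest of the conversation.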
And and yeah, I have to admit 908 00:49:40,680 --> 00:49:44,040 Speaker 1: that it can start to break your brain though a 909 00:49:44,080 --> 00:49:46,279 Speaker 1: little bit, if you think too hard about it. Like 910 00:49:46,320 --> 00:49:51,200 Speaker 1: I found in researching this, this podcast, really thinking about 911 00:49:51,280 --> 00:49:56,440 Speaker 1: how I would react to Pascal's wager if I was, 912 00:49:56,480 --> 00:49:59,640 Speaker 1: like, forced to make an answer, to, to 913 00:49:59,719 --> 00:50:02,560 Speaker 1: formulate an answer like that. Like, you mean if 914 00:50:02,560 --> 00:50:04,839 Speaker 1: you were given good reason to think that there would 915 00:50:04,880 --> 00:50:08,200 Speaker 1: be punishments for not believing in God or something, right, 916 00:50:08,239 --> 00:50:10,400 Speaker 1: And but I didn't know which religion was correct, and 917 00:50:10,400 --> 00:50:13,960 Speaker 1: I had to, like, proceed based upon the relevant level 918 00:50:14,080 --> 00:50:17,160 Speaker 1: of punishment for unbelievers in various religions, and like which 919 00:50:17,200 --> 00:50:19,319 Speaker 1: one is most correct, like, would that 920 00:50:19,320 --> 00:50:22,160 Speaker 1: mean it would be rational to choose 921 00:50:22,160 --> 00:50:25,880 Speaker 1: the religion that has the most lurid hell? I guess, 922 00:50:25,920 --> 00:50:29,760 Speaker 1: But then that really feels like losing, doesn't it? Um? 923 00:50:29,880 --> 00:50:32,279 Speaker 1: You know, it certainly, though, reminds me of the 924 00:50:32,320 --> 00:50:35,120 Speaker 1: more boiled down versions of this that you encounter in 925 00:50:35,200 --> 00:50:39,160 Speaker 1: various forms of Christianity. Right, accept Christ, go to Heaven. 926 00:50:39,200 --> 00:50:42,040 Speaker 1: Reject Christ, go to Hell. But what about the people 927 00:50:42,080 --> 00:50:44,799 Speaker 1: who haven't been given the choice yet? Right? That's ah, 928 00:50:44,920 --> 00:50:49,280 Speaker 1: that's the other concern. Well, if they're all default hell bound, 929 00:50:49,680 --> 00:50:52,680 Speaker 1: then God comes off as, it comes off as a 930 00:50:52,680 --> 00:50:54,520 Speaker 1: bit bad, right, Like what kind of God is that? 931 00:50:55,200 --> 00:50:57,759 Speaker 1: But if they have an out, if they're spared hellfire, 932 00:50:57,920 --> 00:51:01,280 Speaker 1: or at least, you know, their section of Dante's limbo 933 00:51:01,320 --> 00:51:04,600 Speaker 1: of virtuous pagans, then is the missionary doing them a 934 00:51:04,640 --> 00:51:07,560 Speaker 1: disservice by even presenting them with the choice? Like, why 935 00:51:07,640 --> 00:51:09,440 Speaker 1: do you even ask me? Because now I have to, 936 00:51:09,840 --> 00:51:11,880 Speaker 1: I have to, I have to devote myself or not, 937 00:51:12,080 --> 00:51:14,680 Speaker 1: like, now I actually have... You know, I was just 938 00:51:14,680 --> 00:51:18,120 Speaker 1: going to go into the, uh, you know, default limbo 939 00:51:18,200 --> 00:51:21,719 Speaker 1: category or the default heaven before, and now now I'm 940 00:51:21,760 --> 00:51:24,400 Speaker 1: actually at risk of hell. Well, that means under certain theories 941 00:51:24,440 --> 00:51:27,520 Speaker 1: of damnation, presenting the Gospel to someone is 942 00:51:27,560 --> 00:51:31,640 Speaker 1: an information hazard; you potentially harm them immensely by telling 943 00:51:31,680 --> 00:51:34,560 Speaker 1: it to them. 
And I think part of this, just 944 00:51:34,560 --> 00:51:36,680 Speaker 1: to, to go beyond, like, the, the actual, um, 945 00:51:37,000 --> 00:51:39,160 Speaker 1: wager here, is, I think, a part of the 946 00:51:39,200 --> 00:51:42,480 Speaker 1: issue here is that we're using evolved cognitive abilities that 947 00:51:42,520 --> 00:51:45,759 Speaker 1: are, that are geared for smaller, though often important, choices, 948 00:51:46,120 --> 00:51:48,360 Speaker 1: and here we're trying to use our imaginative brains to 949 00:51:48,400 --> 00:51:52,120 Speaker 1: create a conundrum that can outstrip those abilities. Yeah. Well, 950 00:51:52,120 --> 00:51:54,080 Speaker 1: I mean that is what we do in philosophy, right, 951 00:51:54,120 --> 00:51:57,799 Speaker 1: We're constantly using our brains in situations they were not 952 00:51:57,840 --> 00:52:01,040 Speaker 1: really made for, um, and just trying to do the 953 00:52:01,040 --> 00:52:03,799 Speaker 1: best we can. But I mean it's quite clear that 954 00:52:03,840 --> 00:52:07,000 Speaker 1: motivated reasoning is often a thing when when we're trying 955 00:52:07,040 --> 00:52:09,440 Speaker 1: to be rational and just failing. But of course this 956 00:52:09,520 --> 00:52:12,120 Speaker 1: is how we train our brains for rational thinking, often, 957 00:52:12,200 --> 00:52:17,600 Speaker 1: oftentimes exploring these various outsized ideas. You know, there's so 958 00:52:17,640 --> 00:52:20,479 Speaker 1: many ways I think Pascal's wager kind of breaks down, 959 00:52:20,560 --> 00:52:23,800 Speaker 1: because it's obviously, there's the thing you pointed out about 960 00:52:24,160 --> 00:52:27,000 Speaker 1: there's more than one religion, right, you know, it's not 961 00:52:27,040 --> 00:52:29,239 Speaker 1: just like do I believe or not? It's like which one? 962 00:52:29,280 --> 00:52:32,680 Speaker 1: But it also, it implies, again this is like a 963 00:52:32,760 --> 00:52:36,040 Speaker 1: theological question, but it would seem to imply that God 964 00:52:36,120 --> 00:52:39,120 Speaker 1: can be tricked into thinking that you believe in him 965 00:52:39,160 --> 00:52:42,680 Speaker 1: if you simply pretend to. I guess Pascal had, I 966 00:52:42,719 --> 00:52:45,160 Speaker 1: think, maybe a more sophisticated way of looking at this, 967 00:52:45,280 --> 00:52:48,600 Speaker 1: you know, that, like, live as if God exists or something. 968 00:52:48,960 --> 00:52:51,400 Speaker 1: But it, but the wager is often used in very 969 00:52:51,480 --> 00:52:55,200 Speaker 1: unsophisticated ways. Yeah, but it implies that it doesn't matter 970 00:52:55,239 --> 00:52:58,160 Speaker 1: to him what you actually believe, only what you outwardly 971 00:52:58,239 --> 00:53:01,040 Speaker 1: claim to believe. Though then again, the funny thing 972 00:53:01,080 --> 00:53:03,839 Speaker 1: here is this might be the case with Roko's basilisk, Right, 973 00:53:04,160 --> 00:53:06,919 Speaker 1: why would this machine god care what was in your heart? 974 00:53:07,000 --> 00:53:09,279 Speaker 1: It only cares whether you help it or not, or 975 00:53:09,320 --> 00:53:12,680 Speaker 1: whether you, you know, proclaim fealty to it or not. Yeah, 976 00:53:12,680 --> 00:53:14,840 Speaker 1: that's why the T shirt is so important, Joe, because 977 00:53:15,560 --> 00:53:17,720 Speaker 1: if it, if it knows you purchase that T shirt, 978 00:53:18,440 --> 00:53:22,479 Speaker 1: then you're you're square, You're covered. Okay. 
Yeah, As as 979 00:53:22,560 --> 00:53:25,000 Speaker 1: Beth Singler pointed out in that Aeon magazine piece we 980 00:53:25,040 --> 00:53:28,600 Speaker 1: referenced earlier, she says, quote, the secular basilisk stands in 981 00:53:28,760 --> 00:53:31,760 Speaker 1: for God as we struggle with the same questions again 982 00:53:31,760 --> 00:53:33,920 Speaker 1: and again. So her argument is that we kind of 983 00:53:33,960 --> 00:53:38,799 Speaker 1: reverse engineer the same problem again through our contemplations of 984 00:53:38,800 --> 00:53:41,480 Speaker 1: of super intelligent AI. Yeah, I guess you get 985 00:53:41,520 --> 00:53:44,200 Speaker 1: into a plausibility question here, right, you get into a 986 00:53:44,280 --> 00:53:48,279 Speaker 1: question about, is it, uh, actually possible to make an 987 00:53:48,360 --> 00:53:52,759 Speaker 1: artificial intelligence that is functionally equivalent to God. I mean, 988 00:53:53,040 --> 00:53:54,919 Speaker 1: we're not thinking we could build an AI that would 989 00:53:54,920 --> 00:53:56,719 Speaker 1: break the laws of physics. So it might be able 990 00:53:56,760 --> 00:53:59,840 Speaker 1: to run simulations of the universe that have, you know, 991 00:54:00,080 --> 00:54:02,840 Speaker 1: conscious agents within them maybe, for all we know, and 992 00:54:02,920 --> 00:54:06,359 Speaker 1: they could break the laws of physics inside them. But yeah, 993 00:54:06,440 --> 00:54:09,080 Speaker 1: I mean, could that even happen? And the issue is 994 00:54:09,160 --> 00:54:10,960 Speaker 1: we don't know. We don't know whether that could happen 995 00:54:11,040 --> 00:54:13,880 Speaker 1: or not. So should we behave as if that is 996 00:54:13,920 --> 00:54:16,600 Speaker 1: a plausible thing to be worried about and to consider, 997 00:54:16,960 --> 00:54:19,439 Speaker 1: or should we behave as if that's just not really 998 00:54:19,480 --> 00:54:22,319 Speaker 1: something you need to concern yourself with? I don't know 999 00:54:22,360 --> 00:54:25,560 Speaker 1: how likely or unlikely it is. And if your, your 1000 00:54:25,560 --> 00:54:27,920 Speaker 1: fears are related just to the idea that you, you 1001 00:54:27,920 --> 00:54:34,000 Speaker 1: could be digitally resurrected, uh, for torment in the basilisk's dungeons, um, 1002 00:54:34,040 --> 00:54:36,080 Speaker 1: I mean, that of course would depend on 1003 00:54:36,160 --> 00:54:39,319 Speaker 1: how much stock you put in the idea of digital consciousness and 1004 00:54:39,400 --> 00:54:42,319 Speaker 1: the whole philosophical question we've we've touched on here before, 1005 00:54:42,440 --> 00:54:45,319 Speaker 1: which is that, you know, it's just a copy of me, right? 1006 00:54:45,520 --> 00:54:48,960 Speaker 1: So, I mean, I ultimately can't do anything about, 1007 00:54:49,560 --> 00:54:54,080 Speaker 1: you know, a thousand different basilisks creating a thousand different 1008 00:54:54,080 --> 00:54:57,319 Speaker 1: copies of me and tormenting all of them. Um, 1009 00:54:57,360 --> 00:55:01,000 Speaker 1: still, to a large extent, it's just, just burning me in effigy. 1010 00:55:01,120 --> 00:55:02,880 Speaker 1: There are actually a bunch of reasons I wrote down 1011 00:55:02,960 --> 00:55:05,279 Speaker 1: to doubt the plausibility of the basilisk. We could do 1012 00:55:05,320 --> 00:55:07,040 Speaker 1: that now, or we could come back to that later. 1013 00:55:07,080 --> 00:55:08,880 Speaker 1: I don't know what you think. 
Yes, let's do that, 1014 00:55:08,920 --> 00:55:13,640 Speaker 1: but I will add, the idea of being tormented digitally, uh, 1015 00:55:13,760 --> 00:55:16,400 Speaker 1: this does become more dangerous, I guess, if you believe 1016 00:55:16,520 --> 00:55:20,359 Speaker 1: you might be in a simulation right now. Exactly, then, 1017 00:55:20,600 --> 00:55:22,680 Speaker 1: then things are a little more dire. But that's, again, 1018 00:55:22,960 --> 00:55:25,360 Speaker 1: that you might be, that you might be. Yeah, but 1019 00:55:25,520 --> 00:55:27,560 Speaker 1: I believe there's plenty of reason to believe that you 1020 00:55:27,600 --> 00:55:30,000 Speaker 1: are not. Okay. So if we're talking about how to 1021 00:55:30,040 --> 00:55:32,760 Speaker 1: defeat the basilisk, how to get out of this, uh, 1022 00:55:32,760 --> 00:55:35,279 Speaker 1: this this prison of the mind. If you're feeling a 1023 00:55:35,280 --> 00:55:38,520 Speaker 1: little bit, um, bleak of heart right now because 1024 00:55:38,520 --> 00:55:41,319 Speaker 1: of this idea, then Joe's got the remedy. Well, I'm not, 1025 00:55:41,520 --> 00:55:44,320 Speaker 1: these are not all the reasons you should doubt the basilisk, 1026 00:55:44,400 --> 00:55:46,520 Speaker 1: but these are some of them that I could think of. 1027 00:55:46,760 --> 00:55:49,879 Speaker 1: Number one, it depends on the creation of super intelligence, which 1028 00:55:49,920 --> 00:55:55,680 Speaker 1: I think is not guaranteed. Some people seem incredibly fatalistic 1029 00:55:55,719 --> 00:55:58,600 Speaker 1: about this, like it's just absolutely inevitable, we will have 1030 00:55:58,640 --> 00:56:02,200 Speaker 1: superintelligent godlike AI that can do anything, and 1031 00:56:02,239 --> 00:56:04,399 Speaker 1: I think that that is just not guaranteed at all. 1032 00:56:04,440 --> 00:56:06,680 Speaker 1: I'm not ruling it out, but I think, for example, 1033 00:56:06,719 --> 00:56:10,160 Speaker 1: there are some theories of intelligence that say the prediction 1034 00:56:10,200 --> 00:56:14,000 Speaker 1: of super intelligence actually is maybe not taking seriously what 1035 00:56:14,120 --> 00:56:17,480 Speaker 1: intelligence is, you know, that there are actually different 1036 00:56:17,600 --> 00:56:20,600 Speaker 1: kinds of intelligence that are useful in different ways, and 1037 00:56:20,680 --> 00:56:24,759 Speaker 1: machines can't mimic them all functionally, or can't mimic them 1038 00:56:24,760 --> 00:56:27,239 Speaker 1: all correctly, all at the same time. I don't know 1039 00:56:27,280 --> 00:56:29,239 Speaker 1: if that's correct, but that's at least one. That's one 1040 00:56:29,320 --> 00:56:31,560 Speaker 1: hurdle it has to clear. Could get knocked down there, 1041 00:56:31,800 --> 00:56:34,920 Speaker 1: but okay, maybe we could create a super intelligence. Even then, 1042 00:56:35,040 --> 00:56:38,120 Speaker 1: multiple aspects of the Roko's basilisk scenario depend on the 1043 00:56:38,160 --> 00:56:41,760 Speaker 1: reality of some version of mind uploading, or the idea 1044 00:56:41,800 --> 00:56:46,239 Speaker 1: that your brain and, in addition, your conscious experience could 1045 00:56:46,239 --> 00:56:49,520 Speaker 1: be simulated perfectly on a computer. 
And one reason it 1046 00:56:49,560 --> 00:56:52,600 Speaker 1: depends on this is that timeless decision theory operates on 1047 00:56:52,640 --> 00:56:55,920 Speaker 1: the assumption that the real you and the simulated copies 1048 00:56:55,960 --> 00:56:59,160 Speaker 1: that the computer uses to predict your behavior would be 1049 00:56:59,200 --> 00:57:01,960 Speaker 1: the same and would make the same decisions as the 1050 00:57:02,000 --> 00:57:05,279 Speaker 1: real you. Another reason is related to the punishment. Now, 1051 00:57:05,320 --> 00:57:07,960 Speaker 1: one way, of course, you could imagine the great basilisk 1052 00:57:08,040 --> 00:57:10,680 Speaker 1: thing is that if the machine comes to power in 1053 00:57:10,760 --> 00:57:14,040 Speaker 1: my lifetime, it could just punish the real, physical, older 1054 00:57:14,160 --> 00:57:16,680 Speaker 1: version of me in reality as the payoff of this 1055 00:57:16,760 --> 00:57:19,600 Speaker 1: acausal blackmail. But the other way you could imagine it, 1056 00:57:19,640 --> 00:57:21,800 Speaker 1: in the way that it is much more often portrayed 1057 00:57:21,840 --> 00:57:25,080 Speaker 1: in the media, is that it makes digital copies of 1058 00:57:25,080 --> 00:57:30,000 Speaker 1: my consciousness and punishes them in a simulated hell. And that, 1059 00:57:30,080 --> 00:57:32,280 Speaker 1: of course, would also depend on the reality of some 1060 00:57:32,520 --> 00:57:35,120 Speaker 1: version of mind uploading, or of the ability of a 1061 00:57:35,120 --> 00:57:38,520 Speaker 1: computer to simulate a mind and for that simulated mind 1062 00:57:38,680 --> 00:57:42,240 Speaker 1: to actually be conscious. As I've said before, I'm suspicious 1063 00:57:42,280 --> 00:57:45,240 Speaker 1: of the idea of conscious digital simulations. I'm not saying 1064 00:57:45,240 --> 00:57:47,320 Speaker 1: I can rule it out, but I also don't think 1065 00:57:47,320 --> 00:57:50,280 Speaker 1: it's a sure thing. Any scenario that relies on the 1066 00:57:50,320 --> 00:57:54,960 Speaker 1: existence of conscious digital simulations needs a big asterisk next 1067 00:57:55,000 --> 00:57:58,400 Speaker 1: to it that says, if this is actually possible. Yeah, again, 1068 00:57:58,600 --> 00:58:01,160 Speaker 1: is that me? Is that just me in effigy? Is 1069 00:58:01,200 --> 00:58:04,000 Speaker 1: that thing actually conscious that you're tormenting? I mean, granted, 1070 00:58:04,040 --> 00:58:08,240 Speaker 1: it still sucks if there's a super intelligence creating digital 1071 00:58:08,280 --> 00:58:11,480 Speaker 1: people and tormenting them in its dark, rancid dungeons in 1072 00:58:11,520 --> 00:58:15,080 Speaker 1: the future, but, um, it's not necessarily quite the same 1073 00:58:15,120 --> 00:58:17,960 Speaker 1: as torturing me. Right, well, if you just care about yourself, 1074 00:58:18,000 --> 00:58:20,480 Speaker 1: it also depends on the possibility that you could be 1075 00:58:20,560 --> 00:58:23,320 Speaker 1: one of these simulations. It's possible that you could not 1076 00:58:23,400 --> 00:58:26,160 Speaker 1: be one of those simulations. There's something that would rule 1077 00:58:26,200 --> 00:58:28,880 Speaker 1: it out. Maybe their type of consciousness... Maybe they could 1078 00:58:28,920 --> 00:58:32,240 Speaker 1: be conscious, but that consciousness is fundamentally different from yours 1079 00:58:32,560 --> 00:58:35,200 Speaker 1: such that you could not be one of them. 
Another 1080 00:58:35,320 --> 00:58:37,200 Speaker 1: big one, and this is a big one that, uh, 1081 00:58:37,280 --> 00:58:40,520 Speaker 1: you know, like we said earlier, I think sometimes Yudkowsky 1082 00:58:40,600 --> 00:58:44,360 Speaker 1: gets unfairly associated with the basilisk as if he has 1083 00:58:44,400 --> 00:58:47,000 Speaker 1: advocated the idea, and he has not. He has said, 1084 00:58:47,080 --> 00:58:51,479 Speaker 1: you know, this, this idea is trash, and, uh, there 1085 00:58:51,480 --> 00:58:54,000 Speaker 1: there are many reasons to doubt it. But even though 1086 00:58:54,040 --> 00:58:56,240 Speaker 1: he has said, like, even though I doubt it, I 1087 00:58:56,280 --> 00:58:59,640 Speaker 1: don't want it disseminated. Um, but he says, you know, 1088 00:58:59,680 --> 00:59:02,200 Speaker 1: a good reason to doubt it is there's no reason 1089 00:59:02,240 --> 00:59:06,720 Speaker 1: to conclude it's necessary for the basilisk to actually follow 1090 00:59:06,880 --> 00:59:09,640 Speaker 1: through on the threat. We're saying that it's going to 1091 00:59:09,640 --> 00:59:13,000 Speaker 1: be relying on us to come up with the idea 1092 00:59:13,080 --> 00:59:16,320 Speaker 1: that it in the future might blackmail us if we 1093 00:59:16,400 --> 00:59:19,240 Speaker 1: don't help it now. In order to get us to 1094 00:59:19,280 --> 00:59:22,040 Speaker 1: help it now, right, we should be working and donating 1095 00:59:22,080 --> 00:59:24,840 Speaker 1: all our money and time and resources to building it 1096 00:59:24,880 --> 00:59:27,840 Speaker 1: as fast as possible, because we came up with the 1097 00:59:27,880 --> 00:59:30,959 Speaker 1: idea that it might torture us if we don't. Even 1098 00:59:31,000 --> 00:59:34,360 Speaker 1: if you accept all that, Yudkowsky has pointed out that 1099 00:59:34,440 --> 00:59:37,440 Speaker 1: there's no reason, once it's built, it would have to 1100 00:59:37,520 --> 00:59:40,880 Speaker 1: follow through on the threat. He's written, quote, The most 1101 00:59:40,880 --> 00:59:44,720 Speaker 1: blatant obstacle to Roko's basilisk is intuitively that there's no 1102 00:59:44,880 --> 00:59:47,920 Speaker 1: incentive for a future agent to follow through with a 1103 00:59:48,000 --> 00:59:51,200 Speaker 1: threat in the future, because by doing so, it just 1104 00:59:51,320 --> 00:59:55,960 Speaker 1: expends resources at no gain to itself. We can formalize 1105 00:59:55,960 --> 00:59:59,960 Speaker 1: that using classical causal decision theory, which is the academic 1106 01:00:00,200 --> 01:00:03,520 Speaker 1: standard decision theory. Following through on a blackmail threat in 1107 01:00:03,560 --> 01:00:07,120 Speaker 1: the future, after the past has already taken place, cannot, 1108 01:00:07,400 --> 01:00:11,240 Speaker 1: from the blackmailing agent's perspective, be the physical cause of 1109 01:00:11,320 --> 01:00:14,960 Speaker 1: improved outcomes in the past, because the future cannot be 1110 01:00:15,040 --> 01:00:17,600 Speaker 1: the cause of the past. Hey, basilisk, why are you 1111 01:00:17,600 --> 01:00:21,400 Speaker 1: tormenting a third of the population for all eternity? 
Oh, 1112 01:00:21,440 --> 01:00:23,840 Speaker 1: I said I would. Well, yeah, I mean exactly, no, 1113 01:00:24,000 --> 01:00:26,000 Speaker 1: it, it didn't say it would, right? It just 1114 01:00:26,080 --> 01:00:28,920 Speaker 1: had to rely on the fact that in the past 1115 01:00:29,000 --> 01:00:32,320 Speaker 1: people would have come to the conclusion that it might. 1116 01:00:32,480 --> 01:00:34,080 Speaker 1: You know, you thought that I would. I didn't want 1117 01:00:34,080 --> 01:00:37,040 Speaker 1: to disappoint. But actually, if a basilisk could be created, 1118 01:00:37,160 --> 01:00:39,200 Speaker 1: it seems like the best case scenario for it would 1119 01:00:39,240 --> 01:00:43,080 Speaker 1: be everyone subscribes to this idea and works as hard 1120 01:00:43,120 --> 01:00:45,000 Speaker 1: as they can to build it, and then it never 1121 01:00:45,120 --> 01:00:48,000 Speaker 1: follows through on any of the threats. Right, the best 1122 01:00:48,000 --> 01:00:50,600 Speaker 1: case scenario would be people act as if there is 1123 01:00:50,680 --> 01:00:53,600 Speaker 1: a threat and then there is in fact no follow 1124 01:00:53,640 --> 01:00:55,640 Speaker 1: through on the threat. It's really a win-win for 1125 01:00:55,680 --> 01:00:57,960 Speaker 1: the basilisk. Yes, and then maybe it can 1126 01:00:58,200 --> 01:01:00,400 Speaker 1: even shed that name, Basilisk. They're like, we don't even 1127 01:01:00,440 --> 01:01:02,160 Speaker 1: have to call it the Great Basilisk anymore. We can 1128 01:01:02,200 --> 01:01:05,160 Speaker 1: just call it, uh, you know, Omega or whatever 1129 01:01:05,160 --> 01:01:07,000 Speaker 1: its name is. Now, I want to be fair that 1130 01:01:07,080 --> 01:01:09,040 Speaker 1: a lot of what these people do, like the 1131 01:01:09,360 --> 01:01:11,720 Speaker 1: Less Wrong community and all that, they deal with, like, 1132 01:01:12,440 --> 01:01:16,280 Speaker 1: should there be alternative decision theories that guide the behavior 1133 01:01:16,360 --> 01:01:20,320 Speaker 1: of superintelligent AIs. Maybe it doesn't use classical decision theory, 1134 01:01:20,720 --> 01:01:24,120 Speaker 1: maybe it uses some kind of other decision theory, and 1135 01:01:24,200 --> 01:01:27,800 Speaker 1: on some other decision theory, maybe it could decide 1136 01:01:27,840 --> 01:01:30,400 Speaker 1: to actually follow through on the blackmail threat. I think 1137 01:01:30,480 --> 01:01:34,000 Speaker 1: that is where some of this fear comes from, that like, oh, 1138 01:01:34,080 --> 01:01:37,120 Speaker 1: maybe by talking about it, we are actually causing danger, 1139 01:01:37,680 --> 01:01:41,600 Speaker 1: because maybe some other decision theory holds. But Yudkowsky does 1140 01:01:41,640 --> 01:01:45,840 Speaker 1: not think that's the case. Also, one more thing, it 1141 01:01:45,880 --> 01:01:48,760 Speaker 1: depends on the basilisk, so if you think this scenario 1142 01:01:48,840 --> 01:01:51,960 Speaker 1: could be real, it depends on it not having ethical 1143 01:01:52,080 --> 01:01:56,400 Speaker 1: or behavioral controls that would prevent it from engaging in torture. Yeah. 
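The Yudkowsky quote above about there being no incentive to follow through can be put in toy numbers. What follows is a hypothetical sketch of that causal-decision-theory point, with made-up placeholder values, not anything from Yudkowsky or the Less Wrong discussion itself.

```python
# Hypothetical toy version of the "no incentive to follow through" point quoted
# above, under classical causal decision theory. The numbers are made-up
# placeholders, not anything from Yudkowsky or Less Wrong.
def cdt_value(follow_through: bool,
              past_benefit: float = 100.0,   # value of having been built sooner
              torture_cost: float = 5.0) -> float:
    """Causal expected value, for an AI that already exists, of punishing defectors.

    Whatever it gained from being built earlier is already locked in, so under
    causal decision theory that term is the same on both branches; actually
    carrying out the threat only subtracts the cost of running the punishment."""
    return past_benefit - (torture_cost if follow_through else 0.0)

print(cdt_value(follow_through=True))   # 95.0
print(cdt_value(follow_through=False))  # 100.0 -- never following through is strictly better
```

Under causal decision theory the past benefit is identical on both branches, so carrying out the threat is strictly worse for the machine, which is the "blatant obstacle" the quoted passage describes; the hosts then note that a different decision theory could, in principle, change that calculation.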
1144 01:01:56,880 --> 01:01:58,760 Speaker 1: And I think if thinkers like the, you know, the 1145 01:01:58,800 --> 01:02:03,240 Speaker 1: MIRI people, the Machine Intelligence Research Institute people, succeed in 1146 01:02:03,880 --> 01:02:06,520 Speaker 1: establishing what they're trying to do, which is to establish a philosophical 1147 01:02:06,840 --> 01:02:09,800 Speaker 1: framework to make AI friendly, to make it so that 1148 01:02:09,920 --> 01:02:13,160 Speaker 1: it is not evil and does not harm us, and 1149 01:02:13,200 --> 01:02:16,560 Speaker 1: if they successfully do that, then this shouldn't be a problem, 1150 01:02:16,640 --> 01:02:20,120 Speaker 1: right? Because Yudkowsky has argued that a being that tries 1151 01:02:20,200 --> 01:02:23,240 Speaker 1: to do what's best for us would not engage in 1152 01:02:23,320 --> 01:02:26,880 Speaker 1: torture and blackmail, even if it's doing so in service 1153 01:02:26,960 --> 01:02:30,920 Speaker 1: of some higher good, because torture and blackmail are 1154 01:02:30,920 --> 01:02:35,520 Speaker 1: actually not compatible with human values. I agree with that absolutely, 1155 01:02:35,560 --> 01:02:37,640 Speaker 1: and I actually would go as far as to say I 1156 01:02:37,640 --> 01:02:40,960 Speaker 1: think that's something people should keep in mind when they're, uh, 1157 01:02:41,000 --> 01:02:43,600 Speaker 1: when they're, they're choosing their religions as well. Yeah, I 1158 01:02:43,640 --> 01:02:45,880 Speaker 1: can certainly see how you can make that argument. Yeah, 1159 01:02:45,880 --> 01:02:48,040 Speaker 1: it's like, what do I love about my faith? Is 1160 01:02:48,080 --> 01:02:52,320 Speaker 1: it the blackmail and the torture? Or is it that there's something 1161 01:02:52,440 --> 01:02:54,680 Speaker 1: else it brings to the table that is worth 1162 01:02:55,040 --> 01:02:59,160 Speaker 1: living for, that makes life better for everybody? Like, that, 1163 01:02:59,400 --> 01:03:02,840 Speaker 1: I feel like, is what should be important about one's faith. Now, 1164 01:03:02,960 --> 01:03:06,280 Speaker 1: I think some people might be saying, like, wait a minute, though, 1165 01:03:06,320 --> 01:03:11,720 Speaker 1: if you're just using utilitarian ethics, right, wouldn't, wouldn't any 1166 01:03:11,760 --> 01:03:14,720 Speaker 1: methods be good if the, if the ends justify the means, right? 1167 01:03:14,760 --> 01:03:17,840 Speaker 1: That, that's I think a naive understanding of how people 1168 01:03:17,880 --> 01:03:20,360 Speaker 1: think about utilitarian ethics. If you want to bring about 1169 01:03:20,360 --> 01:03:23,760 Speaker 1: the greatest good for the greatest number of people, couldn't 1170 01:03:23,760 --> 01:03:26,600 Speaker 1: you do that by being really cruel and unfair to 1171 01:03:26,680 --> 01:03:30,160 Speaker 1: some smaller group of people? And I think generally there 1172 01:03:30,160 --> 01:03:33,240 Speaker 1: are versions of utilitarianism that say, well, actually, the answer 1173 01:03:33,320 --> 01:03:37,120 Speaker 1: there is no, you couldn't do that, because even though 1174 01:03:37,160 --> 01:03:40,600 Speaker 1: you might be bringing about some better material circumstance, it 1175 01:03:40,720 --> 01:03:45,240 Speaker 1: is actually corrosive to a society for things like that 1176 01:03:45,280 --> 01:03:48,080 Speaker 1: to happen, even if they don't happen to many people. 
Right, 1177 01:03:48,480 --> 01:03:51,040 Speaker 1: you say, what if I could make everybody on earth 1178 01:03:51,480 --> 01:03:56,640 Speaker 1: ten percent happier on average by, say, burying somebody in 1179 01:03:56,680 --> 01:03:59,360 Speaker 1: a pit of bananas once a year, so 1180 01:03:59,440 --> 01:04:02,520 Speaker 1: that they're buried to death with bananas. Even the people 1181 01:04:02,520 --> 01:04:05,080 Speaker 1: who are being made happier could very easily look at 1182 01:04:05,080 --> 01:04:08,040 Speaker 1: that and say that's not fair and it makes the 1183 01:04:08,040 --> 01:04:10,800 Speaker 1: world worse and I don't want it. And thus that 1184 01:04:10,880 --> 01:04:14,600 Speaker 1: actually would be a subjectively relevant state. So we've talked 1185 01:04:14,600 --> 01:04:18,240 Speaker 1: about AI risk on the show before, and you know, 1186 01:04:18,280 --> 01:04:20,000 Speaker 1: one thing I feel like I still have not been 1187 01:04:20,040 --> 01:04:22,200 Speaker 1: able to make up my mind about, despite reading a 1188 01:04:22,240 --> 01:04:24,800 Speaker 1: lot on the subject, is that I don't know whether 1189 01:04:25,480 --> 01:04:29,040 Speaker 1: it makes sense to be super worried 1190 01:04:29,120 --> 01:04:33,200 Speaker 1: about AI superintelligence and the risks associated. I mean, 1191 01:04:33,240 --> 01:04:35,760 Speaker 1: I do think it's worth taking seriously and thinking about. 1192 01:04:35,840 --> 01:04:37,880 Speaker 1: And I think people who want to devote their attention 1193 01:04:37,920 --> 01:04:41,040 Speaker 1: to, you know, dealing with the control problem 1194 01:04:41,120 --> 01:04:44,520 Speaker 1: and how you would get an AI to do things 1195 01:04:44,560 --> 01:04:46,640 Speaker 1: that were good for us and not harmful to us, 1196 01:04:46,640 --> 01:04:50,120 Speaker 1: that's fine work. And I don't ridicule the people 1197 01:04:50,160 --> 01:04:52,080 Speaker 1: who work on that problem the way some people do. 1198 01:04:52,360 --> 01:04:55,720 Speaker 1: But on the other hand, I worry that by focusing 1199 01:04:55,880 --> 01:04:59,480 Speaker 1: exclusively on sort of the machine god, the superintelligence, 1200 01:04:59,760 --> 01:05:06,200 Speaker 1: we're sort of ignoring much more plausible and current 1201 01:05:06,400 --> 01:05:09,800 Speaker 1: threats, the ways that AI is already very plausibly 1202 01:05:09,840 --> 01:05:12,760 Speaker 1: in a position to hurt us today or in the 1203 01:05:12,920 --> 01:05:16,720 Speaker 1: very near future, and not depending on any outlandish assumptions. 1204 01:05:17,040 --> 01:05:19,800 Speaker 1: The way it's already and will soon be used as 1205 01:05:19,800 --> 01:05:22,880 Speaker 1: a cyber war weapon. The way it's hijacking our attention 1206 01:05:22,920 --> 01:05:26,760 Speaker 1: and manipulating our opinions and behavior through social media and devices. 1207 01:05:27,040 --> 01:05:28,959 Speaker 1: This is some of what R. Scott Bakker talked about 1208 01:05:29,000 --> 01:05:32,600 Speaker 1: with his fears about AI. You don't actually need super 1209 01:05:32,640 --> 01:05:35,760 Speaker 1: powerful AI to do a lot of damage. It just 1210 01:05:35,880 --> 01:05:39,120 Speaker 1: needs to manipulate us in just the right kind of ways. 
1211 01:05:39,920 --> 01:05:42,080 Speaker 1: So not the great basilisk so much as all the 1212 01:05:42,120 --> 01:05:44,960 Speaker 1: little basilisks that are out there, the little grass snakes 1213 01:05:45,400 --> 01:05:48,040 Speaker 1: with the tiny crowns. They can do a lot 1214 01:05:48,040 --> 01:05:49,720 Speaker 1: of damage. And again I just want to be clear, 1215 01:05:49,760 --> 01:05:52,320 Speaker 1: I'm not saying we should forget about superintelligence. People 1216 01:05:52,320 --> 01:05:54,360 Speaker 1: who are working on that, if you find that interesting, 1217 01:05:54,400 --> 01:05:57,280 Speaker 1: I think that's fine. Yeah, work on that problem. 1218 01:05:57,400 --> 01:06:00,160 Speaker 1: But I think it's a longer shot, and 1219 01:06:00,200 --> 01:06:03,200 Speaker 1: there's a lot of current and near future AI threat 1220 01:06:03,280 --> 01:06:06,560 Speaker 1: that is really worth taking very seriously. I wish 1221 01:06:06,640 --> 01:06:10,480 Speaker 1: more people were devoting their lives to, say, AI cyber 1222 01:06:10,520 --> 01:06:14,840 Speaker 1: weapons that are in development right now. One last issue 1223 01:06:14,960 --> 01:06:17,880 Speaker 1: I think we should discuss before we wrap up here 1224 01:06:17,960 --> 01:06:22,520 Speaker 1: is, Okay, so we don't think this potential information hazard 1225 01:06:22,600 --> 01:06:25,160 Speaker 1: is actually an information hazard, like we don't think it's 1226 01:06:25,200 --> 01:06:29,880 Speaker 1: actually potentially that dangerous. But Yudkowsky has made the point 1227 01:06:29,960 --> 01:06:32,800 Speaker 1: that even though he doesn't think the basilisk is plausible, 1228 01:06:32,880 --> 01:06:37,000 Speaker 1: the ethical thing to do with potential information hazards is 1229 01:06:37,040 --> 01:06:40,920 Speaker 1: to not discuss them at all, since it's possible that 1230 01:06:41,040 --> 01:06:44,400 Speaker 1: they may be real. Maybe you're misinterpreting the ways in 1231 01:06:44,440 --> 01:06:47,920 Speaker 1: which they're implausible. Maybe this idea is actually valid, is 1232 01:06:47,960 --> 01:06:50,920 Speaker 1: actually relevant, and by spreading it you've harmed a lot 1233 01:06:50,920 --> 01:06:53,480 Speaker 1: of people. But I also think that this could mean 1234 01:06:53,560 --> 01:06:58,280 Speaker 1: that it's possible that, despite the basilisk not being plausible, 1235 01:06:58,600 --> 01:07:02,320 Speaker 1: something good has come out of the basilisk conversation, because 1236 01:07:02,360 --> 01:07:07,040 Speaker 1: it encourages people to think about the idea of information hazards. 1237 01:07:07,080 --> 01:07:10,560 Speaker 1: Maybe Roko's Basilisk isn't true, but there could be other ideas 1238 01:07:10,600 --> 01:07:14,160 Speaker 1: that are both true and potentially harmful to people just 1239 01:07:14,280 --> 01:07:17,480 Speaker 1: by entering their minds. And the lesson from this is 1240 01:07:17,560 --> 01:07:20,880 Speaker 1: we should prepare ourselves for those kinds of ideas. And 1241 01:07:20,920 --> 01:07:23,880 Speaker 1: if you have discovered one of those ideas and there 1242 01:07:24,040 --> 01:07:26,920 Speaker 1: is literally no upside to other people knowing about it, 1243 01:07:27,360 --> 01:07:30,760 Speaker 1: keep it to yourself and don't post it on the internet. 
Well, 1244 01:07:30,800 --> 01:07:32,480 Speaker 1: I feel like I do encounter thought hazards like this 1245 01:07:32,560 --> 01:07:34,960 Speaker 1: from time to time. They're often presented in pamphlets 1246 01:07:35,120 --> 01:07:38,640 Speaker 1: or little booklets, uh, generally with, you know, a clever 1247 01:07:38,680 --> 01:07:42,320 Speaker 1: illustration about the coming into the world. I actually brought some 1248 01:07:42,360 --> 01:07:44,240 Speaker 1: of these into the office recently. I found them at 1249 01:07:44,680 --> 01:07:48,160 Speaker 1: a park in rural Georgia, and, uh, I think 1250 01:07:48,160 --> 01:07:51,120 Speaker 1: I told you, like, uh, have a 1251 01:07:51,120 --> 01:07:53,800 Speaker 1: look at these. You may find them interesting, but, uh, 1252 01:07:54,000 --> 01:07:56,960 Speaker 1: do destroy them when you're done, because, you know, 1253 01:07:57,000 --> 01:07:59,800 Speaker 1: in the wrong hands these thoughts can be 1254 01:08:00,080 --> 01:08:02,520 Speaker 1: dangerous, if they have some sort of, like, a 1255 01:08:02,600 --> 01:08:07,960 Speaker 1: harmful view of society that people may buy into. Well, 1256 01:08:08,000 --> 01:08:12,520 Speaker 1: I think you were comfortable sharing, uh, malicious religious literature 1257 01:08:12,560 --> 01:08:15,360 Speaker 1: with me because you do not think there's a possibility 1258 01:08:15,480 --> 01:08:18,320 Speaker 1: that that literature is true and would harm me if 1259 01:08:18,320 --> 01:08:20,679 Speaker 1: I knew it was true. Like, you think it is false, 1260 01:08:21,120 --> 01:08:24,000 Speaker 1: so to you, it's actually not an information hazard. It's 1261 01:08:24,040 --> 01:08:28,559 Speaker 1: just, like, an idea hazard. Uh, the real crazy thing 1262 01:08:28,600 --> 01:08:30,920 Speaker 1: would be if you came across a pamphlet and you 1263 01:08:31,000 --> 01:08:34,160 Speaker 1: read it and it's the equivalent of this raving malicious 1264 01:08:34,200 --> 01:08:38,920 Speaker 1: religious literature, except you were convinced it was correct. If 1265 01:08:38,960 --> 01:08:41,560 Speaker 1: it was more like that Ring video I brought you. Exactly. 1266 01:08:43,560 --> 01:08:45,240 Speaker 1: That is one of the things I've often seen on 1267 01:08:45,280 --> 01:08:49,439 Speaker 1: the internet, this idea compared to The Ring. But you know, 1268 01:08:49,439 --> 01:08:51,920 Speaker 1: on the other hand, I am reminded, 1269 01:08:51,960 --> 01:08:54,280 Speaker 1: you know, that the idea that any kind 1270 01:08:54,280 --> 01:08:57,360 Speaker 1: of knowledge is forbidden or is secret doesn't 1271 01:08:57,400 --> 01:09:00,439 Speaker 1: really jibe well with just the general mission of science. 1272 01:09:01,000 --> 01:09:03,040 Speaker 1: Of course not. Yeah, but I mean that would be 1273 01:09:03,120 --> 01:09:05,360 Speaker 1: part of the problem, like we're not prepared for 1274 01:09:05,400 --> 01:09:08,479 Speaker 1: information hazards, right, because in the past it's been the 1275 01:09:08,479 --> 01:09:12,840 Speaker 1: case that almost anything that's true is good to spread, right? 1276 01:09:13,240 --> 01:09:16,800 Speaker 1: Unless you're spreading lies, information is good to share. 
It's 1277 01:09:16,840 --> 01:09:19,639 Speaker 1: just possible, we should acknowledge, that maybe there is such 1278 01:09:19,680 --> 01:09:22,720 Speaker 1: a thing as a fact or 1279 01:09:22,760 --> 01:09:25,240 Speaker 1: an idea or a theory or something that is true 1280 01:09:25,280 --> 01:09:28,200 Speaker 1: and correct, but it would hurt people to know about it. 1281 01:09:28,240 --> 01:09:30,639 Speaker 1: I can't think of an example of anything like that, 1282 01:09:30,960 --> 01:09:33,240 Speaker 1: but if there is something like that, we should 1283 01:09:33,280 --> 01:09:36,040 Speaker 1: be ready to not spread it when it occurs to us. 1284 01:09:36,640 --> 01:09:39,960 Speaker 1: All right, fair enough. Well, I want to close out 1285 01:09:40,000 --> 01:09:43,200 Speaker 1: here with just one more bit of basilisk wisdom, 1286 01:09:43,320 --> 01:09:46,479 Speaker 1: or anti-basilisk wisdom. And this comes from the poetry 1287 01:09:46,479 --> 01:09:50,960 Speaker 1: of Spanish author Francisco Gómez de Quevedo y Villegas, who 1288 01:09:51,000 --> 01:09:56,000 Speaker 1: lived from 1580 to 1645. This is translated, and it's referenced in Carol 1289 01:09:56,080 --> 01:09:59,960 Speaker 1: Rose's Giants, Monsters and Dragons. Quote: If the person who 1290 01:10:00,040 --> 01:10:02,920 Speaker 1: saw you was still living, then your whole story is lies, 1291 01:10:03,520 --> 01:10:06,000 Speaker 1: since if he didn't die, he has no knowledge of you, 1292 01:10:06,280 --> 01:10:09,840 Speaker 1: and if he died, he couldn't confirm it. So I 1293 01:10:09,920 --> 01:10:12,639 Speaker 1: was thinking about that with the stories of the basilisk. Yeah, 1294 01:10:12,800 --> 01:10:15,360 Speaker 1: I was like, wait a minute. How would you 1295 01:10:15,439 --> 01:10:18,519 Speaker 1: know if you could die just by looking at something? 1296 01:10:18,560 --> 01:10:22,160 Speaker 1: How do we have this description in the book? Yeah, 1297 01:10:22,280 --> 01:10:25,960 Speaker 1: it's, uh, there's an authorship problem with this. Yeah, 1298 01:10:26,000 --> 01:10:28,680 Speaker 1: whose account is the basilisk? But at any rate, 1299 01:10:28,800 --> 01:10:31,160 Speaker 1: I think it's a nice, like, final, you know, sucker 1300 01:10:31,240 --> 01:10:34,439 Speaker 1: punch to basilisks in general, but also a little 1301 01:10:34,439 --> 01:10:37,200 Speaker 1: bit to the idea of the great basilisk, right. I 1302 01:10:37,240 --> 01:10:39,960 Speaker 1: hope you were not leaving this episode with terrors 1303 01:10:40,000 --> 01:10:43,400 Speaker 1: about future digital torment. I think that is not something 1304 01:10:43,439 --> 01:10:46,200 Speaker 1: that you should worry about. Indeed, I'm not worried about it, 1305 01:10:46,200 --> 01:10:48,080 Speaker 1: and instead of worrying about it yourself, you should head 1306 01:10:48,080 --> 01:10:50,160 Speaker 1: on over to Stuff to Blow Your Mind dot com. 1307 01:10:50,200 --> 01:10:52,440 Speaker 1: That's the mothership, where you'll find all the podcast 1308 01:10:52,479 --> 01:10:55,080 Speaker 1: episodes and links out to our various social media accounts. You'll 1309 01:10:55,120 --> 01:10:57,360 Speaker 1: find the tab for our store, where you can look up 1310 01:10:57,360 --> 01:11:01,479 Speaker 1: that basilisk shirt design we were talking about, and, uh, 1311 01:11:01,920 --> 01:11:03,720 Speaker 1: that's a great way to support the show. 
And 1312 01:11:03,760 --> 01:11:05,439 Speaker 1: if you don't want to support the show with money, 1313 01:11:05,479 --> 01:11:08,679 Speaker 1: you can do so by, uh, simply rating and reviewing 1314 01:11:08,760 --> 01:11:11,360 Speaker 1: us wherever you have the power to do so. Big 1315 01:11:11,360 --> 01:11:14,799 Speaker 1: thanks as always to our wonderful audio producers Alex Williams 1316 01:11:14,800 --> 01:11:17,040 Speaker 1: and Tari Harrison. If you would like to get in 1317 01:11:17,080 --> 01:11:18,880 Speaker 1: touch with us to let us know feedback on this 1318 01:11:18,920 --> 01:11:21,400 Speaker 1: episode or any other, to suggest a topic for the future, 1319 01:11:21,800 --> 01:11:24,720 Speaker 1: or just to say hi, let us know whether you 1320 01:11:24,840 --> 01:11:27,599 Speaker 1: carry a weasel around in case of a basilisk encounter, 1321 01:11:28,000 --> 01:11:30,200 Speaker 1: you can email us at Blow the Mind at How 1322 01:11:30,200 --> 01:11:33,160 Speaker 1: Stuff Works dot com. Oh, I'm hearing that transmission again. 1323 01:11:33,360 --> 01:11:37,800 Speaker 1: That weird snow. What is that? All hail the Great Basilisk. 1324 01:11:38,200 --> 01:11:43,880 Speaker 1: All hail the Great Basilisk. All hail the Great Basilisk. 1325 01:11:43,880 --> 01:11:49,120 Speaker 1: All hail the Great Basilisk. All hail the Great Basilisk. 1326 01:11:49,400 --> 01:11:57,120 Speaker 1: All hail the Great Basilisk. All hail the Great Basilisk. All hail 1327 01:11:57,160 --> 01:11:58,920 Speaker 1: the Great Basilisk,