Dressed in a dove-gray robe, his hair now done in boyish loops, one on either side of his head, the child, his face bathed in tears, pressed his small hands together, knelt down, and bowed first towards the east, taking his leave of the deity of the shrine. Then he turned towards the west and began chanting the nembutsu, the invocation of Amida's name. The nun then took him in her arms and, comforting him, said, "There is another capital down there beneath the waves." So they plunged to the bottom of the thousand-fathom sea.

Welcome to Stuff to Blow Your Mind from howstuffworks.com.

Hey, welcome to Stuff to Blow Your Mind. My name is Robert Lamb. And I'm Joe McCormick. And Robert, what was that a reading from? That was a reading from a work that is sometimes referred to as the Japanese Iliad: The Tale of the Heike, a fourteenth-century epic Japanese poem that recounts the struggle between the Heike and the Genji families for control of medieval Japan.
It's a tale of samurai heroes, war, and the tragic fall of the Heike family, with everything coming to a bloody climax in the sea Battle of Dan-no-ura in 1185. Okay, so that's the sea battle that was said to take place in the twelfth century AD, a long time ago, at the end of the courtier era. And this would be, I guess, sending us into another era of Japanese history, right? The era of the shogunate, the rule of the military class. Now, we should say there are multiple translations of this classic epic. This is from the translation by Burton Watson. That's right. Yeah, and you know, different translators all add a little something different to it. I looked at a few different ones. For instance, sometimes the words are "There's another kingdom beneath the waves," and I kind of like that one a little more. But this seems to be one of the more popular translations out there, so we're stuck with it. So why did this child leader and the people around him have to plunge into the ocean? Well, because they were on the losing side.
They were on what turned out to be the wrong side of history at the time. So this again comes at the end of the Heian period, 794 through 1185, and it was largely a peaceful period. The Genji were weakened in the 1150s following two key power struggles in the court, and the Genji leaders involved were executed, but two young boys were spared, Yoritomo and Yoshitsune, and these guys ended up plotting revenge. Twenty years of Heike dominance followed, but you had all these factions that were plotting against Heike rule, leading to revolt in 1180. Five years later they would finish the Heike, bringing an end to the court aristocracy and again beginning this age of the shogunate, the rule of the warrior class. And this particular heartbreaking read here, which seriously, every time I've read it in preparation for this podcast, gives me chill bumps, takes us to the very end. So the Heike battle fleet has been annihilated; the Genji have completely defeated them.
Right. The few survivors, the warriors and sailors, have thrown themselves into the sea to drown instead of being captured. And in this reading, the Lady Nii, grandmother of the emperor, who in this translation is described simply as a nun, takes the seven-year-old emperor Antoku out on a boat and sees that he is not captured by the victors. So they drown themselves in their defeat, and she consoles him with this line: there is another capital beneath the waves. Yeah, it's haunting and tragic and heartbreaking. Well, but it seeds this idea that, at least in the boy emperor's mind, he wasn't killing himself; he was transitioning to another stage of rule. Yeah, it's a heartbreaking passage for a number of reasons, because on one hand they're describing the boy as very regal and kinglike, you know, almost kind of a holy child emperor, which makes his decision, or at least his acceptance of his fate, a little more beautiful and noble. But at the same time, you can't help it.
You read that and imagine the alternate view, where, however noble the child's bearing, maybe it's just a child. He's just a child, and he is about to die beneath the waves instead of being captured by the enemy. And then there is also this idea that the world, that the actual kingdom, is just so rife with violence and horror at this point that the kingdom beneath the waves, the kingdom of death, is ultimately the better choice: complete annihilation over trying to live in this sort of world anymore. Yeah, that sadness does come through. But also there is this interesting suggestion of a hierarchy even after they have drowned, because his servants, the samurai who have survived, come with him, right? They all drown themselves as well. You can imagine them, maybe in heavy armor, drowning in the waves with their master, and the boy drowns with them. And you mentioned that there are a couple of different translations of that line: there's another kingdom beneath the waves, or another capital beneath the waves.
The idea of a capital suggests there's a whole society, and a hierarchy within the society, and that you will be in this capital. Like, we're going to the big boss, and maybe you will be the big boss. Who knows? Yeah, I mean, especially after our recent episodes about myths of merpeople and beings that live beneath the sea, the magical ramifications of this are pretty obvious: the idea that the fallen ruler and his followers will continue to live and thrive in another magical place. Now, this is going to be the bridge to our actual topic today. How are we going to get from this beautiful and sad medieval Japanese epic to some crab biology? So, members of the Heike family did survive, mostly women, and their descendants still remember the Battle of Dan-no-ura. According to legend, however, the waters near the battle are home to the ghosts of the drowned Heike warriors, and those ghosts take the form of crabs.
And indeed there is a variety of crab to be found in these waters with a curious arrangement of ridges on its back, ridges that seem to form the dramatic lines of a grimacing samurai war face as depicted in medieval Japanese art. Yes, and I would say not just the faces depicted in medieval art. It also somewhat resembles the samurai masks you will sometimes see, where a samurai armor suit might have a helmet with a mask that partially covers the face. The backs of these crabs, the carapace of the crab, look an awful lot like some of these masks. You have these kind of highly stylized faces, these sort of demonic war grimaces that you see on the face plates of the armor. Yeah. So the idea here would be that the samurai transformed into crabs, or their spirits transformed into crabs, and that if you see one of these scuttling along, then you are seeing the remnant of a fallen samurai warrior. A pair of ragged claws scuttling across the floors of silent seas. Exactly.
Now, of course, before we go too much further, we should let you know this is not the case. This is the magical, mythic, legendary connotation of the story, because certainly, as marine biologist Joel W. Martin points out in his 1993 article "The Samurai Crab," published in Terra, the myth of crab people off the coast of Japan likely predates the Battle of Dan-no-ura, going back at least as far as the thirteenth century, maybe even before. And as is often the case with myths and legends, it was merely adapted to the Heike after the battle. Okay, so you've got these crabs that you can pull up off the floor of the silent sea in this area, and they look like faces. And so he's saying that probably before this battle, people were pulling up these crabs and saying, "I see a face." But after the battle, people started to say not just "I see a face," but "Look, it's the face of those samurai warriors who drowned in these waters." Yeah, and it makes perfect sense, right? You can apply additional narrative to the myth, to the legend here, and it brings it to new life.
But of course, the reality is you can find these crabs. These crabs are an actual species. They exist. They have nothing to do with ghosts, but they exist, and they really do look like faces. Like, a lot. Yeah, we'll try to include some photos of these crabs on the landing page for this episode at stufftoblowyourmind.com. There is an awesome painting that is included along with Joel Martin's article from 1993. It's a painting by Utagawa Kuniyoshi, and it depicts these drowned rulers down at the bottom of the ocean, and they've got this phalanx of crabs coming towards them with those samurai warrior faces on their backs. But they're lining up almost as if to serve their new leaders. And this painting is awesome. We've got to include this if we can on the landing page, because it has this manic, hyperactive, hallucinatory, Hieronymus Bosch kind of energy blasting out of it. This is such a good painting. Yeah, it's dramatic, and the human elements within the crabs add this additional scuttling horror to the whole piece. I love it.
And it captures the inherent irony of the legend of ghosts of samurai warriors becoming crabs, because such a legend is both haunting and scary and also funny. Crabs are funny, right? I mean, it's not just me, right? Other people probably look at crabs and think that's a kind of funny animal. The way they move, their walking styles, the way they wave their claws around. It is funny, right? Yeah, crabs are inherently funny. I mean, the word crab is inherently funny because of the "cr" sound in English. But yeah, crabs are fun to look at and to chase and to catch, if you can catch them. Crabs are tremendous fun, and they're delicious, or some of them are delicious. Yeah, but you can't take a crab ghost or a crab monster too seriously, I think. I don't know, maybe in Japanese culture it's different, but at least for me, it's impossible.
Like, I think of one of my favorite old horror movies from the fifties, Attack of the Crab Monsters, the Roger Corman movie, which just proves, you know, if you're talking about killer crabs, it's inherently funny, even if they don't look funny. Well, they have to be gigantic to be perceived as a threat, and so maybe that's part of the horror of the legend here: the ghosts of the samurai are trapped in this lesser form. They're all bluster, you know. A crab will wave its claws at you, but all it can really do is run away or maybe pinch you a little bit. It's not an actual mortal threat. But let's take a look at what this crab species actually is, the one with the supposed samurai face on its carapace. The scientific name of this crab would be Heikea japonica, formerly known as Dorippe japonica until it was officially granted its older and more traditional name of Heikea in 1990. And so it's got these ridges on its back. That's the thing that captures everybody's attention. You look on its back.
It's got this carapace shell on the top of it, and it looks a lot like a face. What are these ridges that form the face? Are they purely decorative? Well, as Joel W. Martin points out in that article, "The Samurai Crab," they do serve a purpose. They're external indicators of supportive ridges, or apodemes, inside the creature's carapace; these are the places where muscles attach. He points out that these features are subject to natural selection, but they occur in nearly all members of the crab family Dorippidae all over the world. At least seventeen crab species in two families in the Indo-West Pacific are similar enough to be called heikegani by locals. This also includes a variety of Chinese crab known as the ghost or demon face crab. Right, and heikegani, that would be the more common name for this crab: Heike from the story we told, and gani, the Japanese word for crab, gani or kani. All right, so we've established the legend, and we've established the biological reality of the crab species. But the lingering question is, is there any connection between the two?
And remarkably, there's one, or at least a couple of very famous arguments for a connection here, an actual connection between the perceived faces on the crabs' backs and the legend of the samurai crabs. Yeah. Right. So the question is, we've established what the crab is and what it looks like, but why does it look that way? How did it come to resemble a samurai mask so strongly? Or rather, maybe we should ask instead, why do we so strongly believe we see a samurai mask when we see the crab? So to sort those questions out, we're gonna have to go to our friend Carl Sagan. That's right. We're gonna take a quick break, and when we come back we will introduce Sagan.

Alright, we're back. So, Robert, I think it's a shame that we're never going to get to have Carl Sagan on the podcast. It's such a loss that he's gone. Yeah, I mean, Carl Sagan was one of the most important science communicators of his time.
For anyone who's not familiar, he lived 1934 to 1996, an American astronomer, cosmologist, astrophysicist, astrobiologist, author of several books, and host of the wonderful TV series Cosmos. Sagan was one of those great, rare people who was actually a great working scientist himself. You know, he was an astronomer, he worked with NASA, he did lots of interesting space research and astrophysics, and at the same time was a great science communicator. And those are very different skills. One other name that comes to mind when I think of that pairing is Darwin, right? Darwin was both a great scientist and a great science communicator. But you don't always have those same two skills in one person. Yeah, I mean, he was intelligent, charismatic, he had the scientific pedigree, but he also had this outward passion. He was able to appear on these television shows and you were just instantly enraptured by what he had to say. Yeah. So let's turn to Cosmos. I want to set the scene. It's the fall of 1980. You are settling in to watch episode two of this magnificent new PBS science show, Cosmos.
It's hosted by Sagan. Whether or not you know Sagan by now, if you've seen the first episode, you're already enraptured. You want to hear what he has to say next. And so episode two of the original Cosmos series in 1980 starts with Sagan telling the story we started with today. He starts to tell the story of the Battle of Dan-no-ura and its legendary aftermath. And not only does he tell it, but there is a dramatic reenactment of everything. It's beautiful to watch. We'll include a link to this episode of Cosmos on the landing page of this episode. Right. So we're going to quote from Carl Sagan's explanation of what's going on with the crab and the legends. So he says, quote: This legend raises a lovely problem. How does it come about that the face of a warrior is cut on the carapace of a Japanese crab? How could it be? The answer seems to be that humans made this face. But how? Like many other features, the patterns on the back, or carapace, of this crab are inherited. But among crabs, as among humans, there are many different hereditary lines.
Now suppose, purely by chance, among the distant ancestors of this crab, there came to be one that looked just a little bit like a human face. Long before the Battle of Dan-no-ura, he's saying, fishermen may have been reluctant to eat a crab with a human face. In throwing it back into the sea, they were setting in motion a process of selection. If you're a crab and your carapace is just ordinary, the humans are going to eat you. But if it looks like a face, they'll throw you back, and you'll be able to have lots of nice little baby crabs that all look just like you. As many generations passed, of crabs and fisherfolk alike, the crabs with patterns that looked most like a samurai face preferentially survived, until eventually there was produced not just a human face, not just a Japanese face, but the face of a samurai warrior. Now, that's an incredible idea. Yeah, it's the idea of artificial selection.
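The selection story Sagan tells is basically an algorithm, and you can sketch it as a toy simulation. To be clear, this is a thought experiment with made-up numbers: the "face score," the survival probabilities, and the mutation size below are all invented for illustration, not a model of real crab genetics.

```python
import random

def simulate(generations=200, pop_size=500, seed=42):
    """Return the population's mean face score after selection."""
    rng = random.Random(seed)
    # Start with mostly plain carapaces: face scores uniform in [0, 0.2].
    pop = [rng.random() * 0.2 for _ in range(pop_size)]
    for _ in range(generations):
        # Fishermen keep (eat) plain crabs; face-like crabs get thrown back.
        # Survival probability rises linearly with the face score.
        survivors = [c for c in pop if rng.random() < 0.5 + 0.5 * c] or pop
        # Survivors reproduce; offspring inherit the score with a little noise.
        pop = [min(1.0, max(0.0, rng.choice(survivors) + rng.gauss(0, 0.02)))
               for _ in range(pop_size)]
    return sum(pop) / len(pop)

print(round(simulate(), 2))  # mean face score climbs well above its start
```

If you flatten the survival probability to a constant, the face score no longer climbs, it just drifts; that difference between biased and unbiased survival is the whole of Sagan's argument.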
Sagan was saying that, by accident, the fisherfolk of Japan for many generations had been breeding crabs that looked like samurai, in the same way that we breed agricultural crops or agricultural animals for desired traits. You might breed cows or pigs to produce more milk or to have more meat. You might breed dogs to look a certain way, or to be more friendly, or to herd sheep. It would be like if we bred pugs because we didn't want to eat them. Like, we just don't eat any of the dogs that look kind of like grotesque human babies, and then you just have pugs running all over the place, because they may be delicious, but they look too much like babies. Right. Well, there's a lot of thought about how we began to breed dogs, right? How did domestic dogs become separated from their wolflike ancestors? And I think a lot of the thinking about that is that the process did not begin intentionally, that we started breeding dogs by accident, selecting for certain traits by accident, before we started breeding for certain traits on purpose.
For example, we might have been breeding for the wolflike ancestor of a dog that had more approach behaviors toward humans, because if it had more approach behaviors toward humans, it would come closer and would be more likely to get some scraps from our campsite or something like that. And the dogs that had fewer approach behaviors toward humans, that were more wary and wanted to stay farther away, would not get that extra food and would be less likely to survive. And over time we were accidentally artificially selecting for dogs that like getting close to bipedal primates. And then when we actually began involving ourselves in the decisions, that's when we start saying, well, let's breed these dogs a little smaller so they can get through the wall and eat the rodents that are disturbing our grain crops, that sort of thing. Yeah, or it can be purely aesthetic. You might say, this dog is very cute, I really like the way it looks, I want to see more dogs like it.
Let's breed this dog and make it have lots of babies. And then things get out of control and you wind up with the pug anyway. Right. If only you could see the pug equivalent of what our agricultural crops look like genetically, the crazy breeding processes that have gone into creating the bananas and the corn and all the stuff we eat. It's one of the funny things about people's complaints about genetically modified organisms and food crops: the food crops we eat today are already so amazingly genetically modified from their ancestral natural variants. Yeah, it's a little late in the game, in many respects, to start saying, oh, we don't want to control or dictate what these organisms manifest as. We've been doing that throughout recorded history and before recorded history. Yeah, but anyway, we need to get back to the crabs.
So Sagan is talking about the fact that this is an idea, a hypothesis, to explain why the carapace of this crab looks so much like a human face, and specifically so much like a samurai face: that it's not just a coincidence, but that it has been artificially selected for by human sorting practices in fishing. Now, Sagan was not the originator of this idea. No, Sagan's idea apparently comes originally from the British zoologist Sir Julian Huxley, of the famous Huxley family. That's right, he was the grandson of T. H. Huxley, also known as Darwin's Bulldog. Right. And so Huxley wrote an article that was, believe it or not, published in Life magazine in June of nineteen fifty-two. And reading the words of Julian Huxley in Life magazine in nineteen fifty-two is really funny, because I found a scan of the original magazine, and it's got all these Carnation instant milk ads right across from him, and Pard dog food ads, and weird fifties recipes that involve cooking tuna with Carnation instant milk.
It's like, only Carnation instant milk will make this amazing tuna casserole. Anyway, so he's writing in Life, and he presents the same idea in an article that's more generally about imitation in nature. He writes, quote, the resemblance of Dorippe (and remember, that was the name they were using for this crab back then) to an angry Japanese warrior is far too specific and far too detailed to be accidental. It came about because those crabs with a more perfect resemblance to a warrior's face were less frequently eaten than the others. So again, this is an elegant theory. It makes a certain amount of logical sense. We can all envision the scenario taking place, even without a dramatic reinterpretation from Cosmos. We can see the fisher folk pulling up these crabs, looking at them and going, oh, that one's a little bit too much like a face for me to eat it. I'm just gonna throw it back, and we'll just see what the next one looks like.
Right. I mean, you can imagine that any food animal that looked unnaturally human probably would end up getting selected for in this way. Right. But is it really true? Does it stand the test of time, and the test of additional inquiry into the origins of the crab's weird samurai face? Right. So, for the rest of the episode, we're going to try to address this question: is the Sagan-Huxley hypothesis correct? Was it actually artificial selection by fisher folk creeped out by faces that made the crabs look like this? Or is it just a coincidence? And if it's just a coincidence, what explains this striking resemblance? Well, I've already mentioned marine biologist Joel W. Martin's article, and he drops one fact that I think is definitely an argument against the idea that humans were artificially selecting for these faces in the crabs, and that's that we have fossils of dorippid crabs, or closely related crab species, that date back to times before the emergence of humans.
Oh, okay. So if these crabs, or if some related crabs, were looking like human faces before humans could have been selecting for them, that's definitely going to be a mark against the artificial selection hypothesis. Right, unless you jump through elaborate hoops to say, well, what if an alien species came down, saw the faces of existing hominids, or perhaps had hominid facial features themselves, and then engineered them into the backs of the crabs, etcetera. You have to have some sort of elaborate explanation like that, one that breaks multiple rules of the natural world. Then again, this doesn't necessarily kill the hypothesis, because you could say that some related crabs in this family have features on their backs that do look kind of like faces, and the striking resemblance of the heikegani crabs specifically to a samurai warrior's face could have been honed by artificial selection over time. Right, there might have been an initial resemblance that was sharpened by artificial selection. Exactly.
Another huge detail to consider, though, according to Martin, and this one's really hard to shake: fisher folk are not in the habit of catching these crabs at all, because they only reach a size of about thirty-one millimeters, or one point two inches. I want to come back to that point when I get into another criticism in a bit. Yes, yeah, I have some additional notes on that as well. But the idea here is that they're not really worth the trouble of retrieving from the nets, let alone sorting through to see which ones resemble a face or not, because ultimately you don't care; you have no culinary use for them. I mean, my very brief argument against this is: how about popcorn shrimp? All right, we'll return to this in a minute, the idea of eating the samurai crabs. But what does Richard Dawkins have to say? What does Papa Dawkins have to share on this topic? Well, Dawkins has an interesting take on it. Dawkins has a section on Heikea japonica in his two thousand nine book The Greatest Show on Earth, which I would recommend.
That's a really good one. That's like after he got done talking about religion for a while and went back to writing awesome biology books. So when Dawkins brings up the theory, he mentions first that it's, quote, a lovely theory, too good to easily die. But then he goes on to undercut it. He describes coming across an online poll which allows you to say which of the following you believe. Now, Robert, think about the options here. First option: the Sagan-Huxley theory is correct. A share of respondents agreed with that. The next is: the photos of the crabs resembling samurai are fakes. Some respondents said that, and that's obviously not true. There are tons of these photos, and they don't look particularly faked. Maybe they're thinking they're creations, I don't know. Maybe there was some particular photo that was faked or enhanced; I've never heard of this. But no, there are tons of these photos, and they're obviously not fakes. Next answer: the shells have been carved to look that way. Six percent of respondents said this. I think that's obviously not true. The next one is: it's just a coincidence.
Thirty-eight percent said this. So, it does look like a face, but it's just a coincidence; there was no selection going on for that. And then, finally: the crabs really are drowned samurai warriors. Some respondents said this, and I love how that one scored. There was a higher score for that than for carved crab shells. Which, granted, I don't buy the carving, I don't see the argument for this being a carving, but that makes far more logical sense than the idea that these are actual ghosts. I don't know, I mean, you know, which is more likely? No, wait a minute, I guess you're right. Or maybe people who took the poll were just angry at the end of it, and they're like, I can't believe I wasted my time on this; I'll tell you what, I'm going to vote for the ghosts. Right. So Dawkins writes, quote, I'm afraid I voted with the killjoys. I think, on balance, that the resemblance is probably a coincidence. And Dawkins cites some reasons for saying this. First of all, a couple that he cites as weaker, minor reasons.
First of all, as we said, Martin pointed out in the article we mentioned earlier that the face-like ridges and grooves on the crab's carapace actually correspond directly to underlying muscle attachments. Now, this wouldn't mean that they can't have been sharpened by artificial selection, but it does show that they're not merely meaningless decorations that serve no purpose of their own and could be, you know, selected for in any direction. They're actually just a byproduct of a necessary part of the crab's muscle anatomy. Right. And we also have to remember that there is nothing inherently holy or divine about the human face. It is just what our frontal sensory array looks like. You know, it's kind of an overstatement of the obvious, but it's easy to miss, I think. This is not the face of a primordial god who then created people in his image. This is just what our particular species of primate happens to have on the front of its skull.
Yeah, you need some light-sensitive organs, you've got two of them for depth perception, and then you need something that can chew up stuff. It's achieving one set of goals; the back of this crab is achieving another set of goals. And if those solutions should look vaguely familiar, or remind you of each other, then yeah, that's coincidence. Well, I think one of the things you're pointing out here is that the things that cue to us as faces can be incredibly simple and don't have to depart from randomness all that much. Like, two dots and a line cue us as a face, right? Yeah, we create faces all the time and we see them in everything. Yeah, and this will be a big point we'll come back to in just a minute. The next point that Dawkins makes is that the heikegani crabs are too small to keep. Right, this is another thing that Martin was sort of alluding to. They're too small to keep; any crab catcher would simply throw them back, regardless of what designs they had on their backs, simply because they don't have enough meat.
So, first of all, I was thinking, okay, is this true? We don't know how big they get. But there was a photo that Martin included with his article from nineteen ninety-three. It's of a male specimen caught in Ariake Bay off Kyushu in Japan in nineteen sixty-eight, and you can definitely see the samurai face. It looks like a samurai. But how big is it? The total width of the crab's back at its widest point is twenty point four millimeters, or zero point eight inches. Now, that's less wide than you mentioned earlier; it sounds like they can get up to a little over an inch, or about thirty millimeters. That's not a very big crab. I was trying to imagine cracking the shell to get the meat out of a crab that's about an inch wide. Yeah, even with a fair-sized crab, if you're cracking open enough of them, it begins to feel like an awful lot of work for the meat that you're getting. Yeah. So would they keep a crab like that? I'm not sure; it seems pretty small. Then again, I can't pretend to know the fishing practices of historical Japan.
Well, I can't either, but I do have an illuminating fact on just how inedible a small crab can be. In this case, we need to consider the plight of the green crab. Take me to the green crab. All right. This is a native of the Northeast Atlantic Ocean and the Baltic Sea, but it's an invasive species everywhere else, including New England. Now, these guys are roughly ninety millimeters, or three point five inches, in size. And while it's tempting to say, well, let's just eat these things, they're the enemy, they're invasive, let's just eat them up, in the same way that we've, for instance, promoted the consumption of lionfish, which are also invasive in many areas, they're simply too small to get any meat off of through traditional methods. I'm just imagining it like a scene in a comedy movie, where somebody brings you tiny crabs and the little cracker things, and you're working the nutcracker on something you can barely keep in your fingers.
Yeah, you would have to use tweezers or something. Right. But since there is a reason to wage hungry war on the green crabs, some chefs have started turning them into stock, so that's one potential approach there. And then there's also a Canadian startup called Can Chine that has experimented with using a prototype machine to suck the meat out of green crabs. Now, industrial meat production, that's always the best thing to learn the details of. That industrial part is key; we're talking about modern advancements that would be necessary. This was all from a two thousand fifteen article, Green Crabs Are Multiplying. Should We Eat Them?, by Roger Warner for the Boston Globe. And you know, it's been a few years, but it still paints a picture: our ability to consume these small crabs still depends on technology that we haven't quite yet developed.
He points out that another solution here would be to catch the crabs molting, and essentially have soft-shell green crab that you could indeed fry up, in the same way that you fry up a soft-shell crab. Of course, you'd have to catch them in the molt, at just the right moment, and as of two thousand fifteen they were only seeing a fifty-six to sixty-one percent success rate, and Warner says that we would definitely have to improve that success rate before this would be a feasible source of crab meat. You know, the point you made that's actually sticking with me the most is just the idea of using them for stock. I don't know why I didn't even think of that. Like, you don't necessarily have to be able to get the meat out of it for it to provide some kind of culinary use. I mean, people use a whole bunch of seafood products that are not really themselves edible, like bones and stuff like that, to create stock. You make the stock, you strain them out.
You could put a bunch of tiny crabs in a pot, make some stock, and then strain them out, I assume, as long as they didn't have some kind of bad taste or mess up the water somehow. Yeah. But still, with the green crab, it seems that this is a case where certain chefs who are trying to solve the problem, who are saying, hey, what can we do with this invasive creature, have turned to making stock out of them, and it's supposedly delicious. But I suppose that is not a great reason, in and of itself, for Japanese fisher folk of yore to go out there and catch them. Then again, I've got to come back at you. I was wondering how small a crab people would normally eat in Japan. First, I actually did try to look up the fishing practices of medieval Japan, and I couldn't find any details, or at least nothing about how small a crab people would keep. But I did find a Japan Times Food and Drink article from two thousand two, called In a Pinch, These Will Do Just Fine, by Rick LaPointe.
It's about the culinary uses of freshwater crab species called sawagani, meaning marsh crab or river crab, and the mokuzugani, or the mitten crab. Now, the sawagani in particular is tiny, barely three centimeters long as an adult, and LaPointe writes, quote: sawagani range in color from deep purple to blue to bright crimson. They are a treat all summer long, usually available from late May. Not often seen in local supermarkets, sawagani are sold in larger retail food markets and at any good fish purveyor. As with mokuzugani, sawagani must be cooked thoroughly before being served. These little crabs are eaten whole as a rule, and are usually fried briefly, so the crisp shell and all the legs may be eaten. So they fry them or braise them, eat them whole, eat the whole shell. Oh wow, so they're small enough that their shell is just not that thick. Because normally you only hear this with soft-shell crab, where the new shell has not yet developed.
And this article ends with a recipe, actually, for braised sweet-and-salty sawagani with sake, soy sauce, sugar, and chili powder. It sounds kind of good. It does; I'm suddenly hungry for crab now. I don't know if it's possible to eat heikegani, the samurai crab, in the same way. Maybe the taste or the texture would be different in a way that would make this impossible; maybe the shell is too hard or something. I looked all over the place for recipes or similar stories featuring heikegani, and I couldn't find anything. There were no results I could find for heikegani recipes, or ways of preparing them, or culinary traditions. So I guess it's possible that this could be for cultural reasons, rather than there being some problem with their bodies making them inedible. But I found nothing. All right, let's take another break, and when we come back we will continue to explore the mystery of the samurai crab. All right, we're back. Okay, so we finally are going to get to what Dawkins cites as his main reason for rejecting the Huxley-Sagan theory.
Are you ready, Robert? Dawkins writes, quote: My main reason for skepticism about the Huxley-Sagan theory is that the human brain is demonstrably eager to see faces in random patterns, as we know from scientific evidence, on top of the numerous legends about the faces of Jesus or the Virgin Mary or Mother Teresa being seen on slices of toast, or pizzas, or patches of damp on a wall. This eagerness is enhanced if the pattern departs from randomness in the specific direction of being symmetrical. All crabs, except hermit crabs, are symmetrical anyway. I reluctantly suspect that the resemblance of Heikea to a samurai warrior is no more than an accident, much as I would like to believe that it has been enhanced by natural selection. This phenomenon that Dawkins is talking about is called pareidolia, and it is the tendency that humans have to see information in random noise.
668 00:38:01,880 --> 00:38:04,360 Speaker 1: So when you see a face in the side of 669 00:38:04,400 --> 00:38:07,560 Speaker 1: a tree, or you see the shape of an animal 670 00:38:07,719 --> 00:38:09,960 Speaker 1: in the clouds or anything like that, things that are 671 00:38:10,040 --> 00:38:13,560 Speaker 1: actually just random patterns in nature and have no top 672 00:38:13,640 --> 00:38:16,880 Speaker 1: down control or no information encoded in them still read 673 00:38:17,040 --> 00:38:20,920 Speaker 1: as information to us. Yeah. An example of this, too, 674 00:38:21,000 --> 00:38:23,320 Speaker 1: from the animal realm goes back to our recent Animal 675 00:38:23,400 --> 00:38:26,439 Speaker 1: Lives episode where we talked about the death's-head 676 00:38:26,480 --> 00:38:30,239 Speaker 1: moth, that's a hawkmoth, where we just can't 677 00:38:30,280 --> 00:38:33,320 Speaker 1: get over this skull on its back, but there 678 00:38:33,520 --> 00:38:35,320 Speaker 1: aren't really a lot of 679 00:38:35,360 --> 00:38:38,080 Speaker 1: great arguments as to why it is there. Yeah. So 680 00:38:38,239 --> 00:38:41,960 Speaker 1: pareidolia would be the theory that says, okay, it's 681 00:38:42,120 --> 00:38:44,359 Speaker 1: not actually a skull on its back, it 682 00:38:44,480 --> 00:38:46,840 Speaker 1: hasn't been selected to look like a skull in 683 00:38:46,920 --> 00:38:50,080 Speaker 1: any way. We're just reading information that's not really there, 684 00:38:50,520 --> 00:38:53,040 Speaker 1: because we're primed to look for that kind of stuff 685 00:38:53,760 --> 00:38:57,640 Speaker 1: and obsess over it.
And so Dawkins's argument here is 686 00:38:57,719 --> 00:39:01,440 Speaker 1: essentially that pareidolia is so strong that the departure 687 00:39:01,560 --> 00:39:06,959 Speaker 1: from randomness need not be especially unlikely before we start 688 00:39:07,040 --> 00:39:09,640 Speaker 1: seeing faces in it. I want to phrase the argument 689 00:39:09,719 --> 00:39:12,200 Speaker 1: another way to try to make it more specific 690 00:39:12,320 --> 00:39:16,799 Speaker 1: and measurable. Imagine two different scenarios. Scenario one: imagine these 691 00:39:16,920 --> 00:39:21,560 Speaker 1: crabs were being born with a nearly 692 00:39:21,680 --> 00:39:25,480 Speaker 1: photorealistic image of Toshiro Mifune's character from Yojimbo on their backs. 693 00:39:25,920 --> 00:39:27,879 Speaker 1: Try to imagine that. Right, if you pull a crab 694 00:39:27,960 --> 00:39:30,200 Speaker 1: out of the ocean and it has a photo-real 695 00:39:30,800 --> 00:39:33,320 Speaker 1: copy of a samurai face on it, that would be 696 00:39:33,480 --> 00:39:36,680 Speaker 1: so unlikely to happen naturally or by coincidence. You would 697 00:39:36,719 --> 00:39:40,799 Speaker 1: have to invoke some kind of special, narrow type of selection, right? 698 00:39:40,840 --> 00:39:43,640 Speaker 1: Like, you'd have to say, okay, somebody three-D printed 699 00:39:43,680 --> 00:39:45,920 Speaker 1: this crab carapace and put it back in the ocean, 700 00:39:46,120 --> 00:39:49,040 Speaker 1: or there's some kind of crazy genetic engineering of crabs 701 00:39:49,120 --> 00:39:52,120 Speaker 1: going on. It has to be artificial. And the reason 702 00:39:52,160 --> 00:39:54,200 Speaker 1: it has to be artificial is that it is such 703 00:39:54,360 --> 00:39:58,320 Speaker 1: a strong departure from randomness. Right, there's no way a 704 00:39:58,400 --> 00:40:01,759 Speaker 1: photorealistic image like that could happen by chance.
Right, 705 00:40:01,840 --> 00:40:03,600 Speaker 1: it must be the work of the gods or of 706 00:40:03,719 --> 00:40:07,239 Speaker 1: humans, and they're both big samurai film buffs. So, you know, 707 00:40:08,840 --> 00:40:12,840 Speaker 1: scenario two: if a crab just had two dots positioned 708 00:40:13,120 --> 00:40:16,440 Speaker 1: above a curved line, making a crude approximation of, like, 709 00:40:16,520 --> 00:40:19,800 Speaker 1: a stick-figure smiley face, you would not think that 710 00:40:19,920 --> 00:40:21,880 Speaker 1: this needed to be selected for, right? 711 00:40:22,000 --> 00:40:25,960 Speaker 1: It's so close to random that you wouldn't 712 00:40:26,000 --> 00:40:28,960 Speaker 1: need to invoke any special selection to explain it. Now, 713 00:40:29,040 --> 00:40:32,960 Speaker 1: obviously, with the heikegani crab, we're somewhere between 714 00:40:33,200 --> 00:40:36,480 Speaker 1: those two scenarios. It's not a photorealistic image of a 715 00:40:36,600 --> 00:40:39,759 Speaker 1: famous samurai character, but it's also not just two dots 716 00:40:39,840 --> 00:40:42,080 Speaker 1: with a line or a smiley face. And so the 717 00:40:42,200 --> 00:40:45,600 Speaker 1: question is which of the scenarios it is closer to. 718 00:40:45,920 --> 00:40:48,600 Speaker 1: Is it closer to randomness than we're giving it credit for, 719 00:40:49,120 --> 00:40:52,719 Speaker 1: or is it closer to a real departure from randomness 720 00:40:52,760 --> 00:40:55,600 Speaker 1: than we're giving it credit for?
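The two scenarios can be made concrete with a toy calculation. Here is a minimal sketch of the underlying intuition: if a carapace pattern is modeled as a grid of binary pixels, the chance that pure randomness reproduces a target image falls off exponentially with how many pixels the target pins down. The pixel counts below are made-up illustrations, not measurements of any real crab.

```python
from fractions import Fraction

# Toy model: a "match" to some target image requires k specific binary
# pixels to come out right, so a random pattern matches with chance 2**-k.
def chance_of_random_match(constrained_pixels: int) -> Fraction:
    return Fraction(1, 2**constrained_pixels)

# Scenario two: a crude smiley face (two dots and a curve) pins down
# only a handful of pixels, so random matches are common.
smiley = chance_of_random_match(8)

# Scenario one: a photorealistic portrait pins down thousands of
# pixels, so a random match is effectively impossible.
portrait = chance_of_random_match(10_000)

print(f"crude smiley: 1 in {smiley.denominator}")                    # 1 in 256
print(f"portrait: 1 in 10^{len(str(portrait.denominator)) - 1}")     # ~1 in 10^3010
```

The point of the sketch is only the asymmetry: a pattern as loose as "two dots over a line" recurs constantly by chance, while a detailed portrait never does, and the heikegani sits somewhere on that curve.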
Interesting. This, of course, 721 00:40:55,640 --> 00:40:59,319 Speaker 1: reminds me of various conspiracy theories that are out there, 722 00:40:59,400 --> 00:41:01,600 Speaker 1: you know. Like, it falls into this area where if 723 00:41:01,680 --> 00:41:04,640 Speaker 1: you squint, or if you just turn 724 00:41:04,719 --> 00:41:09,160 Speaker 1: off certain logic toggles in your brain, then 725 00:41:09,200 --> 00:41:11,600 Speaker 1: it can begin to make a perfect kind of sense. 726 00:41:12,040 --> 00:41:15,000 Speaker 1: You know, but there's something about the ambiguity of 727 00:41:15,120 --> 00:41:17,840 Speaker 1: it that gives it power. Yeah. Now, of course, in 728 00:41:17,960 --> 00:41:20,800 Speaker 1: Dawkins's argument, we'd have to notice that in both of 729 00:41:20,920 --> 00:41:23,920 Speaker 1: these scenarios I just mentioned, the crude smiley face or 730 00:41:23,960 --> 00:41:27,000 Speaker 1: the photorealistic image, in both of them we see a face. 731 00:41:27,600 --> 00:41:31,120 Speaker 1: So we're simply wired to see faces in random designs. 732 00:41:31,480 --> 00:41:34,239 Speaker 1: And so Dawkins thinks that the crab's carapace is closer to 733 00:41:34,360 --> 00:41:37,920 Speaker 1: scenario two, the almost-random smiley face, than it is 734 00:41:38,000 --> 00:41:41,480 Speaker 1: to scenario one, the photorealistic face. It's not actually all 735 00:41:41,560 --> 00:41:43,879 Speaker 1: that strong a departure from randomness, and yet we see 736 00:41:43,920 --> 00:41:46,480 Speaker 1: the face anyway, because that's what we do; it's what 737 00:41:46,600 --> 00:41:51,319 Speaker 1: we're wired for. But then again, remember Huxley's claim. Huxley said, specifically, 738 00:41:51,560 --> 00:41:54,520 Speaker 1: the resemblance of Dorippe to an angry Japanese warrior 739 00:41:54,680 --> 00:41:59,080 Speaker 1: is far too specific and too detailed to be accidental.
740 00:41:59,200 --> 00:42:02,120 Speaker 1: So we've got Dawkins and Huxley at odds here. 741 00:42:02,560 --> 00:42:05,600 Speaker 1: Huxley says it's too specific to be a coincidence. Dawkins 742 00:42:05,640 --> 00:42:08,840 Speaker 1: says it's probably a coincidence, and we're just over-interpreting it. 743 00:42:09,320 --> 00:42:11,719 Speaker 1: How do we know who's right here? Well, certainly we 744 00:42:12,120 --> 00:42:13,680 Speaker 1: can go back to some of the other facts 745 00:42:13,719 --> 00:42:17,120 Speaker 1: we've talked about, sort of the time frame, the brief 746 00:42:17,600 --> 00:42:21,359 Speaker 1: period in which samurai art is a thing, or even 747 00:42:21,440 --> 00:42:24,240 Speaker 1: human faces are a thing, versus the larger time scale 748 00:42:24,600 --> 00:42:27,279 Speaker 1: of crab evolution. But then also we can look to 749 00:42:27,440 --> 00:42:32,360 Speaker 1: our particular propensity to see faces in 750 00:42:32,520 --> 00:42:34,960 Speaker 1: things. Like, how strong is this effect exactly? So we 751 00:42:35,040 --> 00:42:36,879 Speaker 1: can come at it from both angles. We can look 752 00:42:36,920 --> 00:42:39,320 Speaker 1: at what's the chance the crab would look like 753 00:42:39,440 --> 00:42:42,440 Speaker 1: that anyway, from a biological perspective, and we can look at 754 00:42:42,600 --> 00:42:45,319 Speaker 1: what's the chance that humans would see faces in things 755 00:42:45,400 --> 00:42:48,239 Speaker 1: that really have hardly a face at all on them. 756 00:42:48,800 --> 00:42:50,600 Speaker 1: And so let's look at the latter. Let's look at 757 00:42:50,640 --> 00:42:54,239 Speaker 1: this idea of pareidolia. How strong and prevalent is the 758 00:42:54,280 --> 00:42:57,359 Speaker 1: pareidolia effect? I want to consult a few studies.
There 759 00:42:57,360 --> 00:43:00,880 Speaker 1: are some that don't quite fit because they've got 760 00:43:01,080 --> 00:43:04,480 Speaker 1: odd methodology, but a lot of the pareidolia studies will 761 00:43:04,560 --> 00:43:07,160 Speaker 1: work like this: you've got an image on a 762 00:43:07,239 --> 00:43:11,719 Speaker 1: screen that has pure noise on it, 763 00:43:11,840 --> 00:43:15,799 Speaker 1: just like random snow static, randomly generated by 764 00:43:15,880 --> 00:43:19,480 Speaker 1: some algorithm, and you ask people, do you see a 765 00:43:19,640 --> 00:43:22,680 Speaker 1: letter encoded in the static, or do you 766 00:43:22,800 --> 00:43:26,000 Speaker 1: see a face? And sometimes the people who are doing 767 00:43:26,040 --> 00:43:28,320 Speaker 1: these experiments will prime you. In fact, in all the 768 00:43:28,440 --> 00:43:31,879 Speaker 1: examples I could find, they were priming people, saying, if 769 00:43:31,920 --> 00:43:34,719 Speaker 1: you see a face in these pictures, tell us when 770 00:43:34,760 --> 00:43:36,520 Speaker 1: you see a face, or tell us what kind of 771 00:43:36,600 --> 00:43:39,720 Speaker 1: face you see. Like, one of the studies had faces 772 00:43:39,840 --> 00:43:43,160 Speaker 1: encoded in the static, but the faces didn't 773 00:43:43,200 --> 00:43:45,480 Speaker 1: have any mouths, and they were asking people, do you 774 00:43:45,480 --> 00:43:49,520 Speaker 1: see a smiling face or a not-smiling face? So 775 00:43:49,680 --> 00:43:52,560 Speaker 1: one study, for example, was by Cory Rieth et al. 776 00:43:52,880 --> 00:43:56,040 Speaker 1: in Perception in two thousand eleven, called Faces in the 777 00:43:56,120 --> 00:43:59,880 Speaker 1: Mist: Illusory Face and Letter Detection.
This had hundreds of 778 00:44:00,000 --> 00:44:02,719 Speaker 1: participants, and the study looked at, among other things, what 779 00:44:03,000 --> 00:44:07,360 Speaker 1: features of random noise images tended to suggest faces and letters. 780 00:44:07,880 --> 00:44:10,200 Speaker 1: And in this study, after a training period with different 781 00:44:10,239 --> 00:44:13,000 Speaker 1: types of images, participants were asked to look at whether 782 00:44:13,160 --> 00:44:16,319 Speaker 1: images had letters or faces embedded in them. And there 783 00:44:16,360 --> 00:44:19,919 Speaker 1: were three experiments with pure noise images, and participants thought 784 00:44:20,000 --> 00:44:22,719 Speaker 1: that there were letters embedded in thirty-six percent of 785 00:44:22,760 --> 00:44:25,360 Speaker 1: the images when it was suggested that was a possibility, 786 00:44:25,719 --> 00:44:29,280 Speaker 1: and participants thought there were faces embedded in between 787 00:44:29,320 --> 00:44:32,720 Speaker 1: thirty-two and thirty-six percent of pure noise images, depending 788 00:44:32,760 --> 00:44:34,480 Speaker 1: on whether or not there was an oval in the 789 00:44:34,560 --> 00:44:37,200 Speaker 1: middle of the image bounding where the face was supposed 790 00:44:37,239 --> 00:44:40,240 Speaker 1: to appear. So it looks like, there, you're showing people 791 00:44:40,320 --> 00:44:43,200 Speaker 1: pure noise, there's nothing encoded in it, and at least 792 00:44:43,280 --> 00:44:46,040 Speaker 1: thirty-two percent of the time, if there's a suggestion 793 00:44:46,160 --> 00:44:48,279 Speaker 1: that there could be a face, people think they see 794 00:44:48,280 --> 00:44:53,120 Speaker 1: a face. Another study, from twenty fourteen, by Jiangang Liu et 795 00:44:53,160 --> 00:44:57,160 Speaker 1: al., called Seeing Jesus in Toast: Neural and Behavioral Correlates 796 00:44:57,200 --> 00:45:00,120 Speaker 1: of Face Pareidolia.
The purpose of the study was 797 00:45:00,160 --> 00:45:04,439 Speaker 1: to, quote, explore face-specific behavioral and neural responses during 798 00:45:04,480 --> 00:45:08,000 Speaker 1: illusory face processing. In other words, they were trying to see, okay, 799 00:45:08,080 --> 00:45:10,759 Speaker 1: we know people sometimes see faces that aren't there. What's 800 00:45:10,760 --> 00:45:13,879 Speaker 1: happening in their brains when they see faces that aren't there? 801 00:45:14,480 --> 00:45:17,400 Speaker 1: And so the participants were twenty healthy Chinese adults, and 802 00:45:17,440 --> 00:45:20,400 Speaker 1: they were shown images composed of pure noise, like random 803 00:45:20,480 --> 00:45:23,840 Speaker 1: grayscale patterns. The researchers led them to believe 804 00:45:24,040 --> 00:45:26,799 Speaker 1: that fifty percent of the pure noise images they were 805 00:45:26,840 --> 00:45:30,840 Speaker 1: seeing contained either images of letters or of faces, and 806 00:45:31,000 --> 00:45:34,440 Speaker 1: under these conditions, looking at pure randomness but being told 807 00:45:34,520 --> 00:45:37,960 Speaker 1: it might contain a face, participants said they saw letters 808 00:45:38,200 --> 00:45:40,560 Speaker 1: in thirty-eight percent of the images and faces in 809 00:45:40,680 --> 00:45:42,880 Speaker 1: thirty-four percent of the images. So that's really close 810 00:45:42,920 --> 00:45:45,320 Speaker 1: to the figures in the last study, right? It's like, 811 00:45:45,480 --> 00:45:48,400 Speaker 1: thirty-something percent of the time, if you're told a 812 00:45:48,520 --> 00:45:51,359 Speaker 1: face might be there and there's nothing there, you will 813 00:45:51,400 --> 00:45:54,160 Speaker 1: see a face anyway.
There's a lot of interesting stuff 814 00:45:54,200 --> 00:45:56,640 Speaker 1: explored in the research apart from just whether we detect 815 00:45:56,719 --> 00:45:58,520 Speaker 1: faces in randomness, and I think it might be worth 816 00:45:58,600 --> 00:46:00,919 Speaker 1: coming back to do a whole episode on the neuroscience 817 00:46:00,960 --> 00:46:03,839 Speaker 1: of pareidolia in the future. In the past, I've thought 818 00:46:03,880 --> 00:46:07,000 Speaker 1: about this in terms of, say, staring into a dark wood, 819 00:46:07,640 --> 00:46:10,239 Speaker 1: you know, where the one thing you don't want to 820 00:46:10,280 --> 00:46:12,760 Speaker 1: see is a creepy witch face or troll face staring 821 00:46:12,800 --> 00:46:14,279 Speaker 1: out at you from the dark. What if you do 822 00:46:14,360 --> 00:46:17,120 Speaker 1: want to see that? Well, then my advice is to 823 00:46:17,239 --> 00:46:21,040 Speaker 1: keep staring, because I'll often have that effect where I'm 824 00:46:21,120 --> 00:46:23,080 Speaker 1: staring into the woods. I mean, not often. 825 00:46:23,160 --> 00:46:25,240 Speaker 1: I don't go out every night and stare into the woods, 826 00:46:25,280 --> 00:46:27,799 Speaker 1: but there are times when I've done that, where 827 00:46:27,800 --> 00:46:29,560 Speaker 1: I'm staring into the woods, sort of checking it out, 828 00:46:29,800 --> 00:46:31,399 Speaker 1: and I'll think, what if I see a witch face? 829 00:46:31,719 --> 00:46:35,160 Speaker 1: And then I'll know, intrinsically, that if I keep looking, 830 00:46:35,520 --> 00:46:37,880 Speaker 1: I'm going to see something that I could interpret as 831 00:46:37,880 --> 00:46:40,920 Speaker 1: a witch face, and it's going to spiral out of control. 832 00:46:41,000 --> 00:46:44,160 Speaker 1: I need to stop staring into the darkness 833 00:46:44,239 --> 00:46:48,320 Speaker 1: of the woods. Yeah, yeah.
There's an interesting theory 834 00:46:48,880 --> 00:46:51,080 Speaker 1: that a lot of these pareidolia studies are based on, 835 00:46:51,600 --> 00:46:55,520 Speaker 1: and it's the idea of studying a black box through 836 00:46:56,080 --> 00:46:59,439 Speaker 1: random noise, using noise to study what's inside a black box. 837 00:46:59,480 --> 00:47:02,280 Speaker 1: So if there's something like a brain or a computer 838 00:47:02,440 --> 00:47:05,719 Speaker 1: whose programming we don't understand, and you want 839 00:47:05,760 --> 00:47:08,520 Speaker 1: to understand how it works, you can't just get inside 840 00:47:08,560 --> 00:47:10,520 Speaker 1: it and cut it up to see how it works. 841 00:47:11,040 --> 00:47:15,239 Speaker 1: But what you can do is stimulate 842 00:47:15,360 --> 00:47:19,759 Speaker 1: it with nothing and see what it generates on its own, 843 00:47:19,880 --> 00:47:23,440 Speaker 1: to sort of understand 844 00:47:23,719 --> 00:47:26,600 Speaker 1: what the base-level algorithms are, what they generate when 845 00:47:26,640 --> 00:47:29,719 Speaker 1: there's no real input. So one example of 846 00:47:29,800 --> 00:47:31,840 Speaker 1: studying the human mind like this would be the sensory 847 00:47:31,880 --> 00:47:35,080 Speaker 1: deprivation tank: you put a human in a sensory deprivation 848 00:47:35,160 --> 00:47:37,719 Speaker 1: tank to see where the mind goes when there's no 849 00:47:38,040 --> 00:47:42,520 Speaker 1: input to base output on.
Because we have evolved to 850 00:47:42,640 --> 00:47:47,160 Speaker 1: thrive in a world of changing stimuli, 851 00:47:47,239 --> 00:47:49,560 Speaker 1: and if you take that out of the equation, then 852 00:47:50,239 --> 00:47:53,400 Speaker 1: all of our sensory feelers are just pawing around at nothing, 853 00:47:53,960 --> 00:47:57,560 Speaker 1: but they can still interpret a form 854 00:47:57,960 --> 00:48:00,840 Speaker 1: in the nothing. Yeah, and that's an interesting way of 855 00:48:00,960 --> 00:48:02,920 Speaker 1: learning about the nature of the mind. Right, when you 856 00:48:03,000 --> 00:48:05,719 Speaker 1: take away all stimuli, you start to learn, well, what's 857 00:48:05,760 --> 00:48:08,040 Speaker 1: going on at the base level in my mind? 858 00:48:08,239 --> 00:48:10,239 Speaker 1: What will it churn up when there's nothing 859 00:48:10,320 --> 00:48:13,000 Speaker 1: coming in? And so a similar thing would be showing 860 00:48:13,120 --> 00:48:17,399 Speaker 1: somebody randomness. Now, these studies aren't exactly pure randomness tests; they're 861 00:48:17,400 --> 00:48:20,719 Speaker 1: not totally black boxes, because they're always priming the participants.
862 00:48:20,719 --> 00:48:23,160 Speaker 1: They're always saying, like, you might see a face; tell 863 00:48:23,239 --> 00:48:25,440 Speaker 1: me if you see a face in this image. And 864 00:48:25,600 --> 00:48:27,919 Speaker 1: under those conditions, it looks like when you show people 865 00:48:28,200 --> 00:48:30,920 Speaker 1: random noise that has no information in it and tell 866 00:48:31,000 --> 00:48:33,279 Speaker 1: them there might be a face, thirty-something percent of 867 00:48:33,360 --> 00:48:36,000 Speaker 1: the time people tend to see a face. That sounds 868 00:48:36,040 --> 00:48:40,359 Speaker 1: like pareidolia is naturally pretty strong, even among people who 869 00:48:40,400 --> 00:48:43,160 Speaker 1: are not, like, prone to hallucinations or anything. So I 870 00:48:43,239 --> 00:48:45,239 Speaker 1: think that's probably a point in the favor of 871 00:48:45,320 --> 00:48:47,719 Speaker 1: Dawkins's explanation. Yeah, I mean, if you think about it 872 00:48:47,800 --> 00:48:50,719 Speaker 1: in terms of human evolution and what is 873 00:48:50,840 --> 00:48:54,840 Speaker 1: valuable environmental information, you know, few things are 874 00:48:54,880 --> 00:48:57,840 Speaker 1: more important than the presence of another organism, because it 875 00:48:57,880 --> 00:48:59,680 Speaker 1: could be a prey organism, it could be a predator 876 00:48:59,800 --> 00:49:01,160 Speaker 1: organism, it could be a member of your 877 00:49:01,200 --> 00:49:03,680 Speaker 1: own species, which brings with it a number of different 878 00:49:03,719 --> 00:49:06,960 Speaker 1: possibilities that tie into your survival, right, especially if it's 879 00:49:07,000 --> 00:49:08,719 Speaker 1: a member of your own species and you are a 880 00:49:08,920 --> 00:49:12,640 Speaker 1: social animal like we are.
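That black-box idea, probing a system with pure noise and keeping only the trials where it reports a face, can be sketched in a few lines of code. This is a rough illustration of the logic, not a model of the actual experiments: a made-up two-eyes-and-a-mouth template stands in for a primed observer's hidden face bias, and averaging the noise from the "yes" trials recovers that hidden template.

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up internal "face template" on an 8x8 grid: two eyes and a mouth.
template = np.zeros((8, 8))
template[2, 2] = template[2, 5] = 1.0  # eyes
template[5, 2:6] = 1.0                 # mouth
template -= template.mean()            # zero-mean, like the noise

def says_face(noise, threshold=1.5):
    # The simulated observer reports a face whenever the noise happens
    # to correlate strongly enough with its internal template.
    return float((noise * template).sum()) > threshold

# Show the observer thousands of pure-noise images and keep the ones
# it called a face.
yes_trials = [n for n in (rng.standard_normal((8, 8)) for _ in range(20_000))
              if says_face(n)]

# Averaging the noise from the "yes" trials recovers something that
# resembles the hidden template: structure pulled out of pure
# randomness, but it came from the observer, not from the images.
classification_image = np.mean(yes_trials, axis=0)
corr = np.corrcoef(classification_image.ravel(), template.ravel())[0, 1]
print(f"'face' reports on pure noise: {len(yes_trials)} of 20000")
print(f"correlation of averaged 'yes' noise with template: {corr:.2f}")
```

Note that this toy observer says "face" on a substantial fraction of pure-noise trials, in the same spirit as the thirty-something-percent figures from the studies, though here the rate is just an artifact of the chosen threshold.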
Like, I'm very convinced by 881 00:49:13,120 --> 00:49:17,600 Speaker 1: the idea that social behavior and managing social relationships is 882 00:49:17,640 --> 00:49:20,560 Speaker 1: one of the primary factors that shaped the evolution of 883 00:49:20,640 --> 00:49:24,240 Speaker 1: the anatomically modern human brain. Right. And I earlier 884 00:49:24,320 --> 00:49:26,960 Speaker 1: referred to the human face as a sensory array, and 885 00:49:27,320 --> 00:49:29,759 Speaker 1: part of that goes beyond just the 886 00:49:29,840 --> 00:49:34,080 Speaker 1: fact that it is where our sense organs are 887 00:49:34,200 --> 00:49:38,759 Speaker 1: grouped together; we also use facial expressions and micro-expressions 888 00:49:38,840 --> 00:49:41,680 Speaker 1: to communicate with one another. And we depend 889 00:49:41,719 --> 00:49:46,080 Speaker 1: on it far more than other primates that have, say, 890 00:49:46,160 --> 00:49:51,160 Speaker 1: more uniform facial features. Our faces have evolved 891 00:49:51,600 --> 00:49:56,360 Speaker 1: to help convey meaning to other members of our species. Yes, totally. 892 00:49:56,800 --> 00:49:59,720 Speaker 1: But also our brains have evolved to be on 893 00:50:00,280 --> 00:50:03,240 Speaker 1: hyper-alert for faces. So, I mean, pareidolia appears 894 00:50:03,320 --> 00:50:06,160 Speaker 1: to be strong for all kinds of things, but faces 895 00:50:06,239 --> 00:50:08,800 Speaker 1: are one of these things that we're especially looking for. 896 00:50:09,120 --> 00:50:12,719 Speaker 1: There are dedicated pathways and structures within the brain that are 897 00:50:12,880 --> 00:50:16,040 Speaker 1: on alert to see a face and to start interpreting 898 00:50:16,120 --> 00:50:18,319 Speaker 1: what's up with the face when you see it. Yes, 899 00:50:18,400 --> 00:50:20,120 Speaker 1: and then what kind of intent is behind it?
So 900 00:50:20,239 --> 00:50:23,920 Speaker 1: it's not that irrational, really, to imagine plucking a 901 00:50:24,000 --> 00:50:26,520 Speaker 1: crab out of the sea, looking at it, and saying, oh, 902 00:50:26,680 --> 00:50:28,239 Speaker 1: this crab has a face on its back, and I 903 00:50:28,280 --> 00:50:31,480 Speaker 1: think it's angry at me. Yeah. Now, one last point 904 00:50:31,520 --> 00:50:34,920 Speaker 1: I want to talk about against the artificial selection hypothesis, 905 00:50:35,000 --> 00:50:37,320 Speaker 1: one that I thought was very interesting and very straightforward 906 00:50:37,320 --> 00:50:39,680 Speaker 1: and simple. I came across this one in a short 907 00:50:39,800 --> 00:50:43,279 Speaker 1: two thousand ten blog post by an invertebrate biologist named 908 00:50:43,320 --> 00:50:46,360 Speaker 1: Michael Bok. And we've been talking about the heikegani 909 00:50:46,400 --> 00:50:49,920 Speaker 1: crab specifically, but the heikegani crab is a member 910 00:50:50,080 --> 00:50:53,319 Speaker 1: of a whole family of crabs called Dorippidae. We 911 00:50:53,440 --> 00:50:55,600 Speaker 1: might have mentioned Dorippidae earlier, but we need to 912 00:50:55,680 --> 00:50:58,799 Speaker 1: remember there are all kinds of related crabs.
And what Bok 913 00:50:58,880 --> 00:51:01,840 Speaker 1: pointed out is that a variety of crabs from the Dorippidae 914 00:51:01,920 --> 00:51:05,680 Speaker 1: family all have human-looking faces on their backs, and 915 00:51:05,800 --> 00:51:08,960 Speaker 1: lots of these crabs don't even exist in human fisheries, 916 00:51:09,560 --> 00:51:14,040 Speaker 1: so there's no tradition of humans catching them and potentially 917 00:51:14,520 --> 00:51:17,640 Speaker 1: keeping or releasing them based on the designs on their backs. Right, 918 00:51:17,680 --> 00:51:20,560 Speaker 1: there's no way they could have been shaped by artificial selection, 919 00:51:20,640 --> 00:51:23,879 Speaker 1: and yet they look like faces anyway. So to test 920 00:51:23,960 --> 00:51:25,879 Speaker 1: this out for myself, I wanted to look up other 921 00:51:25,960 --> 00:51:28,000 Speaker 1: crabs in the family Dorippidae. It does 922 00:51:28,040 --> 00:51:30,480 Speaker 1: appear to be a pretty obscure crab family. It's not 923 00:51:30,640 --> 00:51:34,120 Speaker 1: stuff that has, you know, really storied species with lots 924 00:51:34,160 --> 00:51:37,200 Speaker 1: of articles about them. But I did discover, to my delight, 925 00:51:37,320 --> 00:51:41,040 Speaker 1: there is an Internet crab database. Thank the gods for 926 00:51:41,160 --> 00:51:44,200 Speaker 1: such a thing, an Internet crab database, and some of the 927 00:51:44,480 --> 00:51:47,520 Speaker 1: entries have images with them. So I wanted, before we 928 00:51:47,600 --> 00:51:49,840 Speaker 1: wrap up, to look at a little more pareidolia 929 00:51:49,920 --> 00:51:53,440 Speaker 1: bait from family Dorippidae. So first I've included 930 00:51:53,440 --> 00:51:55,600 Speaker 1: a picture for us to look at of Dorippe 931 00:51:55,680 --> 00:51:59,640 Speaker 1: quadridens. What does this look like? This one looks 932 00:51:59,719 --> 00:52:02,360 Speaker 1: kind of like Darth Maul, or like a giraffe.
Yeah, it 933 00:52:02,400 --> 00:52:04,759 Speaker 1: looks like Darth Maul. It also kind of looks like 934 00:52:04,840 --> 00:52:07,279 Speaker 1: a spider face. Do you see that? Yeah, well, I 935 00:52:07,320 --> 00:52:09,160 Speaker 1: mean, it's hard not to look at a crab and 936 00:52:09,200 --> 00:52:12,719 Speaker 1: get a certain arachnid feel for them. Right, well, 937 00:52:12,760 --> 00:52:14,960 Speaker 1: we're looking at a crab top down, but it looks 938 00:52:15,040 --> 00:52:18,960 Speaker 1: sort of like Shelob face-on. Yeah, it does. 939 00:52:19,680 --> 00:52:24,000 Speaker 1: How about Dorippoides facchino? What does this look like? All right, well, 940 00:52:24,080 --> 00:52:26,640 Speaker 1: this one definitely has kind of a samurai mask 941 00:52:26,680 --> 00:52:29,200 Speaker 1: look to it. But also it reminded me a lot 942 00:52:29,320 --> 00:52:34,200 Speaker 1: of the character Ponda Baba from Star Wars, the walrus-looking 943 00:52:34,320 --> 00:52:37,719 Speaker 1: character in the cantina. Oh yeah, the bug eyes. Yeah, 944 00:52:37,880 --> 00:52:41,279 Speaker 1: he doesn't like you, that guy. That one, that's what 945 00:52:41,400 --> 00:52:43,920 Speaker 1: I see. I see, like, a stylized samurai version of 946 00:52:44,000 --> 00:52:48,359 Speaker 1: that character in this crab. I see a straight-up Predator mask. Yeah. Yeah, 947 00:52:49,920 --> 00:52:51,840 Speaker 1: you know, a lot of this reminds me of that 948 00:52:52,200 --> 00:52:55,279 Speaker 1: common scenario where you're looking up at the 949 00:52:55,360 --> 00:52:58,440 Speaker 1: clouds with a friend, and one person sees this animal 950 00:52:58,719 --> 00:53:00,960 Speaker 1: or this face or this object, and then you see 951 00:53:00,960 --> 00:53:04,200 Speaker 1: another one. And when you present the data 952 00:53:04,280 --> 00:53:05,800 Speaker 1: to someone, they're like, oh yeah, I can see that.
953 00:53:05,840 --> 00:53:07,840 Speaker 1: I can see a unicorn. I 954 00:53:07,920 --> 00:53:10,280 Speaker 1: was seeing a whale, but now I can see the unicorn. 955 00:53:10,320 --> 00:53:12,760 Speaker 1: And now I can't unsee the unicorn. So I've primed 956 00:53:12,760 --> 00:53:15,600 Speaker 1: you for Predator now. Yeah. Okay, now we've got to 957 00:53:15,600 --> 00:53:18,520 Speaker 1: look at a couple of pictures of Medorippe lanata. 958 00:53:18,920 --> 00:53:21,800 Speaker 1: What do you see here, Robert? This one reminds me 959 00:53:21,920 --> 00:53:24,880 Speaker 1: of some of the creatures in the movie The Diver. 960 00:53:25,080 --> 00:53:27,320 Speaker 1: Did you ever see that? No, I've never seen that. 961 00:53:27,440 --> 00:53:29,400 Speaker 1: So we've got two pictures. In one, it's sort of 962 00:53:29,560 --> 00:53:32,520 Speaker 1: standing up, and it looks like a face to me, 963 00:53:32,680 --> 00:53:35,279 Speaker 1: but it's got its swimming legs hanging off the back, 964 00:53:35,719 --> 00:53:38,879 Speaker 1: and they're sort of hairy looking, so it actually looks 965 00:53:38,960 --> 00:53:41,120 Speaker 1: like a person with, like, a Fu Manchu mustache. 966 00:53:41,440 --> 00:53:43,719 Speaker 1: Oh, see, well, when I looked at this picture, I 967 00:53:44,040 --> 00:53:46,280 Speaker 1: saw it looking like it's flipping the bird, 968 00:53:46,440 --> 00:53:49,000 Speaker 1: like, double birds. Oh, it's got the fingers coming up 969 00:53:49,000 --> 00:53:50,680 Speaker 1: in the air. So yeah, it's somebody with a big, 970 00:53:50,760 --> 00:53:53,279 Speaker 1: long Fu Manchu mustache, but it's also flipping the 971 00:53:53,320 --> 00:53:55,040 Speaker 1: birds up in the air. See, I mainly saw 972 00:53:55,080 --> 00:53:57,080 Speaker 1: Stone Cold Steve Austin when I looked at it, because 973 00:53:57,120 --> 00:53:59,480 Speaker 1: of the fingers.
But then the next one that you shared, 974 00:53:59,640 --> 00:54:01,560 Speaker 1: this one, the one 975 00:54:01,560 --> 00:54:03,319 Speaker 1: that's more of a picture of its face. 976 00:54:04,360 --> 00:54:06,440 Speaker 1: Bringing it all back home to me, that looks like 977 00:54:06,560 --> 00:54:09,319 Speaker 1: the villain, the giant crab, in Attack of the Crab 978 00:54:09,400 --> 00:54:12,880 Speaker 1: Monsters, because it has these kind of sad, droopy human eyes. 979 00:54:13,360 --> 00:54:15,360 Speaker 1: It does look like that. Yeah, it reminds me a 980 00:54:15,440 --> 00:54:18,319 Speaker 1: lot of this movie that has come up before, 981 00:54:18,680 --> 00:54:21,240 Speaker 1: I think on the podcast, but definitely on the Trailer 982 00:54:21,280 --> 00:54:23,640 Speaker 1: Talk video series that we did for a while. Well, anyway, 983 00:54:23,719 --> 00:54:26,480 Speaker 1: as Bok points out in his blog post, all 984 00:54:26,560 --> 00:54:29,160 Speaker 1: these crabs to some extent look like human faces, but not 985 00:54:29,320 --> 00:54:31,520 Speaker 1: all of them could have been shaped by fisher folk. 986 00:54:31,840 --> 00:54:35,400 Speaker 1: So while I would not rule out the possibility that 987 00:54:35,719 --> 00:54:39,399 Speaker 1: certain species of crabs with, you know, symmetry on their 988 00:54:39,400 --> 00:54:41,560 Speaker 1: backs and things that look kind of like faces could 989 00:54:41,600 --> 00:54:44,880 Speaker 1: have been honed by artificial selection, that is, it's possible that 990 00:54:45,320 --> 00:54:49,480 Speaker 1: fishing practices and throwing things back could have maybe sharpened 991 00:54:49,600 --> 00:54:53,279 Speaker 1: the features, I wouldn't use that to explain the emergence 992 00:54:53,360 --> 00:54:57,040 Speaker 1: of the features themselves. Right. Yeah, that's pretty much 993 00:54:57,440 --> 00:55:00,440 Speaker 1: my read on it too.
Like the situation that 994 00:55:00,600 --> 00:55:03,520 Speaker 1: Sagan especially is laying out here is not at 995 00:55:03,560 --> 00:55:07,839 Speaker 1: all unbelievable or unscientific. It's just not necessary. Yeah, 996 00:55:08,000 --> 00:55:11,360 Speaker 1: the evidence against it seems a 997 00:55:11,440 --> 00:55:14,960 Speaker 1: little too strong. Yeah. And it's not necessary. And if 998 00:55:15,000 --> 00:55:17,719 Speaker 1: it's not necessary in science, that means it doesn't pass 999 00:55:17,760 --> 00:55:20,040 Speaker 1: the test of parsimony. Right. You don't need 1000 00:55:20,120 --> 00:55:23,120 Speaker 1: to invoke explanations that are not required. Right. It's like, 1001 00:55:23,239 --> 00:55:26,520 Speaker 1: again, invoking an alien species visiting the Earth and 1002 00:55:26,880 --> 00:55:29,920 Speaker 1: doodling faces on the backs of the crabs. Right. It's 1003 00:55:29,960 --> 00:55:33,080 Speaker 1: more plausible than that, but it's still just as unnecessary. 1004 00:55:34,000 --> 00:55:35,879 Speaker 1: Now, I also need to point out again, though, 1005 00:55:35,920 --> 00:55:39,440 Speaker 1: that artificial selection is definitely a thing. We already discussed 1006 00:55:40,280 --> 00:55:44,400 Speaker 1: the selective breeding of various organisms for human purposes, everything 1007 00:55:44,440 --> 00:55:47,280 Speaker 1: from horses and cattle to crops to domestic dog breeds. 1008 00:55:47,520 --> 00:55:50,880 Speaker 1: There's also some evidence for the artificial selection of tuskless 1009 00:55:51,040 --> 00:55:54,560 Speaker 1: elephants due to human poaching. Yeah. Oh, artificial selection is 1010 00:55:54,800 --> 00:55:58,000 Speaker 1: absolutely something that happens all the time.
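[The unconscious selection mechanism being weighed in this part of the conversation, fisherfolk throwing face-like crabs back so the trait spreads, can be sketched as a toy simulation. This is a minimal illustrative model, not anything from the episode or from real heikegani data: the function name, population size, catch rate, and mutation size below are all invented for the sketch.]

```python
import random

def simulate_fisherfolk_selection(generations=300, pop_size=500, seed=42):
    """Toy model of the Huxley/Sagan heikegani hypothesis.

    Each crab carries a 'face-likeness' score in [0, 1]. Every generation,
    roughly half the crabs are caught; caught crabs are thrown back
    (and so survive) with probability equal to their face-likeness.
    Survivors repopulate with small random mutations. All parameters
    are illustrative assumptions, not empirical values.
    """
    rng = random.Random(seed)
    # Start with a mostly face-less population (scores in [0, 0.2]).
    pop = [rng.random() * 0.2 for _ in range(pop_size)]
    for _ in range(generations):
        survivors = []
        for face in pop:
            caught = rng.random() < 0.5        # about half get caught
            thrown_back = rng.random() < face  # face-like crabs get released
            if not caught or thrown_back:
                survivors.append(face)
        # Survivors repopulate; offspring inherit a survivor's score
        # plus a small mutation, clamped to [0, 1].
        pop = [min(1.0, max(0.0, rng.choice(survivors) + rng.gauss(0, 0.02)))
               for _ in range(pop_size)]
    return sum(pop) / len(pop)

mean_face = simulate_fisherfolk_selection()
```

[Run with these made-up numbers, the population's average face-likeness climbs well above its starting value of about 0.1, which is the whole point of the hypothesis: no individual fisher intends anything, yet the throw-back habit acts as a selection pressure. It also illustrates why the argument in this episode still stands: the simulation shows the mechanism is possible, not that it actually happened.]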
And so that 1011 00:55:58,120 --> 00:55:59,960 Speaker 1: feeds into another thing I want to say, which is 1012 00:56:00,120 --> 00:56:03,560 Speaker 1: that I feel really disappointed to lose this theory. It 1013 00:56:03,719 --> 00:56:08,640 Speaker 1: feels sad. It's such a wonderful, beautiful explanation of 1014 00:56:08,760 --> 00:56:12,320 Speaker 1: an actual scientific reality. Yeah, and I know we're not alone. 1015 00:56:12,400 --> 00:56:15,480 Speaker 1: Like, Dawkins commented that the Huxley slash Sagan story was, 1016 00:56:15,600 --> 00:56:18,640 Speaker 1: quote, lovely, and he hated that he had to disagree 1017 00:56:18,680 --> 00:56:20,759 Speaker 1: with it. And I see other writers and scientists around 1018 00:56:20,800 --> 00:56:24,360 Speaker 1: the web expressing similar feelings. They're like, it's probably not correct, 1019 00:56:24,520 --> 00:56:26,640 Speaker 1: but I hate to say that. I really want it 1020 00:56:26,719 --> 00:56:30,080 Speaker 1: to be true. Why do we hate to lose this 1021 00:56:30,239 --> 00:56:35,200 Speaker 1: explanatory story about artificial selection? Like, it's not necessary to 1022 00:56:35,360 --> 00:56:39,280 Speaker 1: provide an example of anything. We have a million examples 1023 00:56:39,320 --> 00:56:42,200 Speaker 1: of artificial selection without it. So why can't we bear 1024 00:56:42,280 --> 00:56:44,600 Speaker 1: to let it go? Because I think, 1025 00:56:44,760 --> 00:56:47,040 Speaker 1: first of all, it's the accidental aspect of it, the 1026 00:56:47,200 --> 00:56:49,279 Speaker 1: idea that we're doing it without even 1027 00:56:49,360 --> 00:56:53,319 Speaker 1: realizing we're doing it, that we're behaving as mad gods, yeah, 1028 00:56:53,520 --> 00:56:57,960 Speaker 1: without realizing it.
It's artificial selection working without the knowledge 1029 00:56:58,040 --> 00:57:01,040 Speaker 1: of the breeders, like all of the magic of intention 1030 00:57:01,200 --> 00:57:03,880 Speaker 1: is removed. And this actually does call into question 1031 00:57:04,239 --> 00:57:08,120 Speaker 1: the very concept of artificial selection. Right? Why do we 1032 00:57:09,040 --> 00:57:13,239 Speaker 1: have a different category for changes that we make to 1033 00:57:13,400 --> 00:57:18,720 Speaker 1: organisms on purpose over time versus changes that happen to organisms 1034 00:57:19,280 --> 00:57:22,520 Speaker 1: due to pressures from different organisms over time? Like, 1035 00:57:22,800 --> 00:57:25,840 Speaker 1: so if a dog 1036 00:57:25,920 --> 00:57:28,840 Speaker 1: ancestor, you know, some kind of ancestral wolf, is 1037 00:57:29,000 --> 00:57:32,120 Speaker 1: shaped by the evolution of a different species, so one 1038 00:57:32,120 --> 00:57:34,760 Speaker 1: of its prey animals or some animal that could hurt 1039 00:57:34,880 --> 00:57:38,959 Speaker 1: it ends up shaping the evolution of this canid over time, 1040 00:57:39,200 --> 00:57:42,280 Speaker 1: you wouldn't call that artificial, you'd call it natural. But 1041 00:57:42,480 --> 00:57:47,560 Speaker 1: if another organism that is a relatively smooth bipedal primate 1042 00:57:47,960 --> 00:57:50,600 Speaker 1: shapes the evolution of that dog for some reason, that's 1043 00:57:50,640 --> 00:57:53,040 Speaker 1: the one exception we make, and we call that artificial 1044 00:57:53,120 --> 00:57:57,160 Speaker 1: selection instead of natural. Maybe it's all natural selection. We 1045 00:57:57,280 --> 00:58:00,600 Speaker 1: are animals too, and the selection pressures that we 1046 00:58:00,760 --> 00:58:03,800 Speaker 1: exert on the natural world are an outgrowth of our 1047 00:58:03,920 --> 00:58:06,920 Speaker 1: genotype and our phenotype.
Well, you know, there's one 1048 00:58:07,000 --> 00:58:10,360 Speaker 1: example from the natural world especially that we should consider 1049 00:58:10,480 --> 00:58:13,520 Speaker 1: coming back to, and that is that of leaf 1050 00:58:13,600 --> 00:58:16,920 Speaker 1: cutter ants. You have a creature here with essentially an 1051 00:58:16,920 --> 00:58:21,200 Speaker 1: agricultural product. Yeah, absolutely. So is the agricultural product that 1052 00:58:21,400 --> 00:58:25,000 Speaker 1: is farmed by the ant an example of artificial selection? 1053 00:58:25,240 --> 00:58:28,240 Speaker 1: I don't think so, right? You'd still say that that's natural. 1054 00:58:28,400 --> 00:58:31,360 Speaker 1: So if that's natural, why aren't all the things that 1055 00:58:31,520 --> 00:58:35,439 Speaker 1: we breed, whether intentionally or unintentionally, natural as well? Well, 1056 00:58:35,640 --> 00:58:38,760 Speaker 1: I think on one level there's the 1057 00:58:38,880 --> 00:58:42,560 Speaker 1: fact that humans can do things to an extreme level 1058 00:58:43,200 --> 00:58:46,920 Speaker 1: that other species cannot do. You know, we can... Well, 1059 00:58:46,960 --> 00:58:48,720 Speaker 1: I don't know if I'd agree with you there, because 1060 00:58:49,160 --> 00:58:53,320 Speaker 1: other species can shape the animals and 1061 00:58:53,520 --> 00:58:58,480 Speaker 1: the organisms they interact with in really extreme and strange ways. Right? Yeah, 1062 00:58:58,600 --> 00:59:02,680 Speaker 1: but I mean, certainly other organisms can cause other 1063 00:59:02,880 --> 00:59:06,120 Speaker 1: organisms to go extinct. They can 1064 00:59:06,360 --> 00:59:10,000 Speaker 1: and do change their natural habitat.
But think of the sheer scope of human change, I mean, 1065 00:59:10,240 --> 00:59:13,240 Speaker 1: 1066 00:59:13,600 --> 00:59:17,040 Speaker 1: the sheer amount of change that we have brought about 1067 00:59:17,080 --> 00:59:19,280 Speaker 1: in the world during our brief time on 1068 00:59:19,360 --> 00:59:24,160 Speaker 1: this Earth. We probably shape the evolution of other organisms 1069 00:59:24,320 --> 00:59:28,680 Speaker 1: maybe more than does any other organism on Earth outside 1070 00:59:28,800 --> 00:59:31,920 Speaker 1: of microbes. And then there's also the added level that 1071 00:59:32,040 --> 00:59:36,360 Speaker 1: we achieve this change via 1072 00:59:36,520 --> 00:59:40,040 Speaker 1: our conscious understanding of the world. I guess consciousness is 1073 00:59:40,080 --> 00:59:43,200 Speaker 1: what's key here. And in that sense, then, if the 1074 00:59:43,280 --> 00:59:47,120 Speaker 1: Sagan Huxley theory were correct, then it wouldn't be 1075 00:59:47,240 --> 00:59:51,200 Speaker 1: artificial selection, would it? Because they weren't doing it on purpose. Yeah, 1076 00:59:51,240 --> 00:59:53,320 Speaker 1: I think that's a strong argument. I guess that probably 1077 00:59:53,360 --> 00:59:55,400 Speaker 1: does it for today. But I'm disappointed we don't get 1078 00:59:55,440 --> 00:59:57,360 Speaker 1: to spend another twenty minutes talking about Attack of the 1079 00:59:57,400 --> 00:59:59,840 Speaker 1: Crab Monsters. Well, this is why we have to bring 1080 01:00:00,000 --> 01:00:02,760 Speaker 1: back Trailer Talk, at least in an audio form, so 1081 01:00:02,840 --> 01:00:07,200 Speaker 1: that we will have space for our movie references to breathe.
1082 01:00:07,800 --> 01:00:09,840 Speaker 1: I can't wait. All right, so hey, in the meantime, 1083 01:00:09,920 --> 01:00:11,640 Speaker 1: if you want to check out other episodes of Stuff to 1084 01:00:11,680 --> 01:00:13,560 Speaker 1: Blow Your Mind, head over to Stuff to Blow Your 1085 01:00:13,600 --> 01:00:15,760 Speaker 1: Mind dot com. You'll find all of the episodes there. 1086 01:00:15,800 --> 01:00:18,880 Speaker 1: You'll also find blog posts, some other content, links out 1087 01:00:18,880 --> 01:00:21,680 Speaker 1: to our various social media accounts such as Facebook, Twitter, Tumblr, 1088 01:00:21,720 --> 01:00:24,360 Speaker 1: and Instagram. Great big thank you as always to our 1089 01:00:24,440 --> 01:00:28,120 Speaker 1: awesome audio producers Alex Williams and Tari Harrison. And if 1090 01:00:28,160 --> 01:00:29,960 Speaker 1: you want to get in touch with us directly, to 1091 01:00:30,040 --> 01:00:32,440 Speaker 1: let us know feedback on this episode or any other, 1092 01:00:32,920 --> 01:00:35,480 Speaker 1: or to request an episode for the future, or just 1093 01:00:35,600 --> 01:00:37,800 Speaker 1: to say hi and see what's up, you can email 1094 01:00:37,880 --> 01:00:40,840 Speaker 1: us at blow the mind at how stuff works dot 1095 01:00:40,920 --> 01:00:52,960 Speaker 1: com. For more on this and thousands of other topics, 1096 01:00:53,160 --> 01:01:10,560 Speaker 1: visit how stuff works dot com.