Hey, everybody. In this week's Select, it's Josh, by the way. We take a look at the Uncanny Valley, a theoretical phenomenon where robots give us the creeps as they get closer to looking human but still don't seem quite right, and we explore academic research into the creeps, which is pretty cool. So enjoy this episode.

Welcome to Stuff You Should Know, a production of iHeartRadio.

Hey, and welcome to the podcast. I'm Josh Clark, and there's Charles W. "Chuck" Bryant, and there's Jerry over there doing the robot, which means this is Stuff You Should Know, robot style. I knew I'd get a laugh out of you sooner or later. Did you do the robot? Can you do the robot? I think I've seen you do a pretty bad robot. Sure. I don't know about a pretty bad robot. I can do a pretty great robot, that's what you've seen. I can't do any of that stuff. Yeah, I can't really either. And really, if your claim to fame is like a really great robot dance, I don't know, maybe take up some other hobbies as well, kind of round that out. You don't want that to be the only thing you're good at, right? Because if you list that on a dating site, you might turn ladies off. Yeah, according to eHarmony. Yeah, that's foreshadowing. I love that one, don't you? Yeah, there's some issues with that whole... Oh yeah, that bit. Yeah, we'll get to that. This is tad. All right.

Well, let's start at the beginning, almost the beginning, Chuck. Let's go back to nineteen seventy, which was the beginning of the greatest decade in the history of humanity. Yeah, neither one of us was born yet. I can finally even say that. I'm still not even born. It must feel good. Yeah. Okay, well, welcome to the club. I guess thanks. In nineteen seventy, we're not just going anywhere, we're going to Japan in nineteen seventy. I bet Japan was pretty cool in the seventies. Yeah, a lot of bell bottoms, a lot of ninjas running around still.
Yeah, there were calculators being wielded all over the place, probably. It was a good time, a good time for Japan, right.

And one of the things that was going on in nineteen seventy, and I could not for the life of me find what issue of this journal it came out in, what month, but at some point in nineteen seventy there was an obscure journal, a Japanese academic journal called Energy, and at some point during that year it published an article by a Japanese roboticist. And his name is Masahiro Mori. Nice. Thank you, I have a lot of practice. And Masahiro Mori published this article and he named it Bukimi no Tani Genshō, is actually the full name of the whole thing. And as we'll see, it's kind of difficult to translate into English, right? And it took many, many years after he wrote this article for it to be translated into English, before anybody even tried to attempt it.

So basically Mori was this roboticist, and he wrote this essay, and at the time he just put it out there and went back to work, started teaching more and more roboticists. A whole new generation of roboticists learned under him, and his work, that article, I should say, just kind of sat there unobserved. And then in two thousand and five a rough translation of it leaked out, it wasn't intended for publication, and the world entirely changed, right? Because Masahiro Mori had, in his article, put his finger on something that no one had before, in his capacities as a roboticist and a human. And that was what we call today the Uncanny Valley.

Yeah, so that's the idea that you're making a robot, and we'll see this applies to more than just robots, but in his case, you're making a robot and you want to make it look like a person, which I guess not all roboticists do. Some of them like the clunky Jetsons-style robots like Rosie. But I guess if you're Mori, you're on the path to designing lifelike robots.
And the closer you get to that lifelike look, everything's going great, everything's going great, people are like, this is so cool, this is so cool. Then all of a sudden, people go, ugh. Right as it approaches, or basically when it reaches, its most lifelike capacity that whoever's making it can conjure, people are repulsed by it. Yeah, which is something that most people whoever hear of the Uncanny Valley are like, yeah, you know, I've noticed that's happened to me before too.

But the thing is, Chuck, it doesn't actually make sense, right? Like, we know a robot is a robot. Yeah. So, you know, maybe you could be afraid it's gonna pick you up and break you in two or something, like a cartoon, but that's different than being creeped out. Like, why would we be creeped out by a robot? And this is what Mori put his finger on: there's something to this, and it doesn't make sense. And it wasn't even just this article that he wrote. He created a graph as well that's become quite famous, that really kind of gets the point across more than anything else.

Yeah. And he wasn't even the first person to go over this and to put some thought to it. Freud, of course, because he liked to think about everything, he thought about it a little bit. And before Freud, there was a German named Ernst Jentsch. Oh, nice. I did not realize that's how his last name should be pronounced. Jentsch. That's good stuff. I think I put a T on the end, but the T's in the middle. Jentsch. Yeah, I think that's right. I've been saying Ginch. Well, we'll have to look that up then. I think, of the two of us, you've got the German down.
And he had a little term, unheimlich, that he called it. So, like, different languages had different names for it, and you'd go back in time, all the way back to like the seventeenth century, and, I guess, you know, robots didn't look super lifelike back then, but whatever their version of lifelike was in the sixteen hundreds, people were like, uh, I don't like that. Why is it looking at me? Yeah, it's got a quill and it's writing things.

But like you said, Mori made this graph, because he was a roboticist and he thought, you know, let's get this plotted out so we can stare at it. And on the x-axis he had likeness, then on the y-axis he had affinity, like whether or not you like the way this thing looks, right? And just as we're talking about, the graph went up and up as things got more lifelike and people liked the way it looked. And then at a certain point there is that valley, there's a big dip that really just kind of says it all, right?

And again, this all makes sense intuitively, but as we'll see, it's been very difficult to prove. And one of the reasons why it's confounded research thus far is because we're not even one hundred percent sure what Mori meant by some of the words he chose, at least as far as translating them to English. Right. For example, bukimi. Right, yeah, it was translated in two thousand and five as uncanny, but again, that original translation was not intended for publication, but it leaked out, and so Uncanny Valley became, you know, the way we all think of it here in the West. But bukimi more closely resembles something like eerie. Like, I've seen it explained that a word like bukimi means more than uncanny. Uncanny is just weird or remarkable or noteworthy; it's not necessarily something that gives you the creeps. Bukimi is something that gives you the creeps. Like Steve Bukimi. Exactly.
So bukimi probably should have been translated as the Eerie Valley, but by the time an actual official translation that Mori signed off on came out in twenty twelve, the cat was out of the bag. Everybody knew of it as the Uncanny Valley, and there's no way anybody was going to come back and be like, no, no, no, everybody stop calling it that, it's now the Eerie Valley, okay? Right, right. And it may be one of those things where we're so used to Uncanny Valley now that it's hard to imagine Eerie Valley. But right, I think that was the issue. Yeah, like, nobody's gonna go along with that.

So this graph, like I said, it starts off on that left-hand side, and this is where you have things that are super robotic, like, you know, a packaging robot in a factory, right, that apparently most people don't have fondness for. I do, because I love mechanical processes. Right, right. Okay, so there's part of the problem. It's like, that's not necessarily the kind of feeling that Masahiro Mori was talking about. He was like, yeah, yeah, you're interested in robotics and robotic arms and industrial processes, and you love watching How It's Made, right? What he was talking about was more like how it resembles a human, and then how it makes you feel in relation to its resemblance of a human. Right. Well, in that case, it makes me feel nothing, because it doesn't look at all like a human. Right. Okay, so that would be at about the origin of the graph. It has no resemblance to a human, really, and it's not eliciting any real affinity in you at all as far as it looking like a human, right? But lots of affinity as a thing. That's just... that's called props.

So you go a little bit further on the graph, and then you have things like little stuffed animals and, I don't know, C-3PO is a common one that's mentioned, because C-3PO, you know, is built to look like a human.
He does a great robot. Kind of. He talks like a human and acts like a human. But when it comes to that face, and as we'll see, the face is kind of the key to all this, for the most part C-3PO looks nothing like a human in the face. So everything's still good, and people love C-3PO. Right. So if you're looking at the graph, C-3PO is going up in human likeness, because he's got some commonality there, and we're feeling affinity for him on that human likeness. So he's going up. Yes. Okay, everything's going pretty well so far, right, Chuck? That's right.

Okay, so then we're gonna start hitting some areas where things start looking a little more human, a lot more human, I would say, than C-3PO. Like, say, the characters in Moana or Frozen, Pixar characters, that kind of thing, where they look like they're supposed to be human, like they're based on humans, but they have very exaggerated features that you would never confuse at first glance for an actual human. Right. So they have like big eyes, small noses, things that make them cute. Right. And so our affinity for them is going up as the human likeness is going up. Again, things are going really well so far. That's right, because in Moana and Frozen they look a little bit more like people, and we like them a lot more for that reason. Right.

And then, like you said earlier, out of nowhere, the whole thing, this line that's just been going up very pleasantly in a nice little slope, just drops downward. Right. And it doesn't just drop downward, it actually goes below the x-axis into negative territory. And now this is the Uncanny Valley. That's right. And that's why it has that name, because it's a valley, right? And this is where those things like really, really lifelike androids live, or corpses live, or zombies live.
Because Mori had the idea that if something's moving, it's even creepier than something similar to it that's not moving. So he actually created two lines on this graph, one for things that are animate and one for things that are inanimate. So if you look at this Uncanny Valley, on the inanimate line, the non-moving line, corpses are at the bottom of it. But if you look at the animate line, it dips even further below than the inanimate line, and at the bottom of that are zombies. So dead people up and moving around and saying "brains" is as creepy as it gets, as far as this graph is concerned.

Yeah, and Mori wasn't the only one. That Ernst Jentsch that we talked about, the German psychiatrist, he also talked about the fact that if you are looking at something that should not be moving, and it moves... I mean, I think we can all agree that a baby doll that suddenly turns its head and looks at you is probably one of the creepier things you can witness. Right, you know. Yeah, it's about as creepy as it gets.

Or have you ever been to an open-casket funeral? A few. I'm not a fan at all. No, it makes sense. You know, we've really kind of closed off, or put a lot of space in between, us and death, way more than we used to have in like the nineteenth century. Oh, they would even sit up with the dead. Sure. Right, so this seems to be kind of a holdover from that. But if you've ever been to an open-casket funeral and have just stared at the corpse long enough, like maybe its arm or its fingers or something, your brain is so anticipating that it's about to start moving that sometimes you can creep yourself out and make yourself think you actually did see it move. You'll also be asked to leave the funeral. Well, you shouldn't be giving a commentary about this out loud, but you can, you know, you can do it to pass the time in the funeral, if you're looking to kill some time.
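For the curious, the shape being described can be sketched in a few lines of Python. All numbers here are invented for illustration: Mori drew his curve freehand, the 1970 essay contains no measured data, and the dip's exact location and depth are assumptions made just to reproduce the picture, with the "moving" line dipping deeper, as the hosts describe.

```python
# A sketch of Mori's freehand curve: affinity rises with human likeness,
# dips sharply into the valley, and recovers at a healthy human.
import numpy as np
import matplotlib.pyplot as plt

likeness = np.linspace(0.0, 1.0, 400)

def affinity(x, valley_depth):
    """Rising affinity with a Gaussian dip standing in for the valley."""
    rise = 0.6 * x
    dip = valley_depth * np.exp(-((x - 0.85) ** 2) / 0.002)
    return rise - dip

plt.plot(likeness, affinity(likeness, 0.8), label="still (corpse at bottom)")
plt.plot(likeness, affinity(likeness, 1.2), label="moving (zombie at bottom)")
plt.axhline(0, color="gray", linewidth=0.5)  # below this line: repulsion
plt.xlabel("human likeness")
plt.ylabel("affinity")
plt.title("Uncanny valley (illustrative shape only)")
plt.legend()
plt.show()
```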
So here's the thing with all this. We know this happens, because everyone kind of has this feeling, but no one, and all this research has been done, no one is exactly sure why this happens. So after Mori's essay, and especially once it was translated, a lot of research started happening in this area. And it's problematic, though, because there are a few different problems. One is, well, it's subjective. This dependent variable, whether you have an affinity for something, is very subjective, so it's hard to kind of nail that down scientifically. Right. All right, so then number two is human likeness, right, this is the independent variable. Yeah. And if you have human likeness, like, what does that mean? Like, what looks human? What doesn't look human? We haven't pinned that down. So, like, if you can't pin the dependent variable down and the independent variable down, it makes it really tough to study. Correct.

And then there's a third one too. I love this one. Yeah, the third one is, you know, the original hypothesis doesn't have a mathematical model that really specifies the shape of this curve. Right. So it's still hypothetical, I guess. Right. Which means, if you look at Mori's graph, he just basically made a line, right? It wasn't based on any studies he'd done. The whole thing was really an essay more than anything else. So researchers who are trying to seriously study this scientifically have nothing that they're actually trying to place their findings against, which puts them at risk for what's called the Texas sharpshooter fallacy.

Yeah, greatest-named fallacy around. And it's based on the idea that if you take a sharpshooter out in Texas and have them shoot at the side of a barn a bunch of times, some of the shots are inevitably going to hit the barn, and then the Texas sharpshooter walks up and draws the bull's-eye around the bullets that he already sunk into the side of the barn. That's the Texas sharpshooter fallacy. It's ignoring data, like the shots where he missed the barn, in favor of the ones that fall into what you're looking for, the bullet holes in the barn.
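To make that concrete, here is a toy simulation of the fallacy; this is our own illustration of the statistics, not anything from the episode or from the uncanny valley literature.

```python
# Scatter "shots" at random, then draw the bull's-eye around whichever
# cluster happens to exist after the fact.
import random

random.seed(42)  # reproducible "shots"
shots = [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(50)]

# Post hoc, find the densest 2x2 patch of the 10x10 barn wall and
# declare it the target.
best_patch, best_hits = None, -1
for gx in range(9):
    for gy in range(9):
        hits = sum(gx <= x < gx + 2 and gy <= y < gy + 2 for (x, y) in shots)
        if hits > best_hits:
            best_patch, best_hits = (gx, gy), hits

# Purely random shooting still yields an impressive-looking cluster once
# you are allowed to choose the target afterward.
print(f"Bull's-eye drawn at {best_patch}, containing {best_hits} of 50 shots")
```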
You could do the same thing with the data that you get from testing the Uncanny Valley if you have no model to fit it into already. Yeah, I think they would have done better if, instead of trying to prove something, they would have just researched and called it a thought experiment, right? You know? Right. But people are taking it seriously, and we'll talk about some of this research right after this, Chuck.

All right, so we're back. And despite the fact that this is really tough to study, and it's not even established that it's a real thing in everyone's mind, by the way, there are people out there who are really studying the Uncanny Valley and trying to pin it down. Yeah, one of these people is a Dartmouth College psychologist. And I didn't look up their mascot. The Pub Darts. The Fighting Pub Darts of Dartmouth College. We're gonna hear from Dartmouth. But her name is Thalia Wheatley, and she's done some research and has found that it's not just some uniquely Western thing or American thing, it's kind of all over the world. She studied tribes in Cambodia, and they have these same sensitivities to things that look human but aren't human. And they've even found, and I think it kind of all comes down to the eyes, but they found just looking at the eye can be enough. Yeah, somebody can tell whether it's a human or not just by looking at a picture of the eye. Right. Yeah. And that's where I think people lose credibility, and we'll talk about movies and sculpture and all that stuff, but they just never get... you can't get the eyes right. Like, you can't put life in lifeless eyes. Yeah, hard as they try. Only God can, uh...
And there was this other experiment where, you know, like where you can morph a face digitally or whatever, like that Michael Jackson "Black or White" video. Yeah, I think some people were creeped out by that even. Sure. But they would show this doll image and it would morph into a human face, and basically they would have people mark where they thought it started to look more human than doll, and it landed at about the sixty-five percent mark as far as morphing into human. Which, I mean, you can't really apply that necessarily, but it's interesting offhand. Sixty-five percent is about where the Uncanny Valley happened in Mori's mind. Yeah, I would think it would be higher than that. Yeah, but it's still super interesting. And you were saying the eyes, and that's what you're betting on, is that it's going to turn out to be the eyes. Right. Yeah.
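As an aside, the simplest version of that doll-to-human morph is a straight cross-fade between two aligned images. The sketch below uses hypothetical file names and blend steps; real morphing stimuli also warp facial geometry, so treat this as a cartoon of the method, not the study's actual procedure.

```python
# Linear cross-fade between two aligned, same-sized face images.
import numpy as np
from PIL import Image

doll = np.asarray(Image.open("doll.png").convert("RGB"), dtype=float)
human = np.asarray(Image.open("human.png").convert("RGB"), dtype=float)

def morph(t: float) -> Image.Image:
    """Blend the two faces: t=0.0 is pure doll, t=1.0 is pure human."""
    frame = (1.0 - t) * doll + t * human
    return Image.fromarray(frame.astype(np.uint8))

# Generate a continuum like the one shown to participants; per the
# experiment described above, faces were reportedly judged "more human
# than doll" around t = 0.65.
for t in (0.0, 0.25, 0.5, 0.65, 0.75, 1.0):
    morph(t).save(f"morph_{int(t * 100):03d}.png")
```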
So, trying to investigate what constitutes human likeness, there's a researcher named Angela Tinwell, and she basically says, like, yes, it's all about the upper facial features, and that we detect those, we read those. And so if there's anything that's even just slightly off in, you know, the eyes or the brows or the wrinkles that form, that will lead to the Uncanny Valley. That's the creeping part. Or the smile, too, she also says. Well, yeah, and all these things kind of come down to evolution. And her point is like, you can't battle millions of years of evolution that has honed our dumb little human brain to detect something that's off about a face, right? It's just too much to overcome, basically. Right.

This other researcher named Karl F. MacDorman, who's from Indiana University, has basically dedicated his career to this now. He found that certain kinds of people, if you do like a personality inventory before testing for Uncanny Valley sensitivity, yeah, some types of people are predictably more sensitive to the Uncanny Valley than others. Specifically, he found that very religious people. That makes total sense. Yeah. Neurotic people. Yeah. And people with a high sensitivity to animal reminders, which is basically anything that reminds you that, hey, you're super civilized and you drive a car and you know how to play poker, but you're still an animal, just as much as that ape over there. That's an animal reminder. People who are sensitive to that kind of thing tend to get set off by the Uncanny Valley as well. And then people who are anxious are more likely to experience the Uncanny Valley, as far as MacDorman is concerned. Yeah, that's interesting. Makes sense too, because they're probably just more prone to, I don't know, just have a reaction to a lot of things like this. Right.

But we should say, the science in all this, the fact that the independent and the dependent variable are still not defined, this is like the scientific equivalent of that backward, over-the-head, half-court basketball shot. Yeah. That's the level of science that these people are carrying out at this point, because a lot of them, sadly, are conducting experiments based on something that, again, doesn't have a set dependent or independent variable. So how can you do that, is my question. Well, yeah, I mean, because in each experiment they're going to be using different stimuli, different faces, whether it's a doll or a wax figure or a CGI character, and then they're gonna be doing different things and have different expressions, and each person has their own subjective take. So it is a very tough thing to kind of nail down. Yeah, and I think some of them are actually trying to form the basis of this field of study right now, they're doing the groundwork.
But I think some of them also are just chasing headlines. Like, there's no better way to get into the media cycle with your study than releasing some findings on the Uncanny Valley. People just eat it up. Sure, they love it.

One thing I thought was interesting was at Princeton they tried this with monkeys, and they found the same thing happened when they had these realistic-looking but fake monkey faces. The monkeys were like, ah, and turned away. It did make me think, though, like, you've seen these situations where an orphaned animal has a creepy puppet mother? Yeah, I know exactly what you're talking about, and they seem to like that. But, however, and this is a bit of a spoiler, but towards the end of this article it points out that human babies don't have this reaction at first either, and that it's kind of learned. So maybe that explains it with the animals. I know the one you're talking about, that caged wire monkey mother. It's super creepy. It's a black and white photo. Well, no, I mean they do it with all kinds of animals. They'll have like a fake tiger or a fake duck or whatever, just so the animal will feed. I mean, it's usually an animal that milks from the mother, I guess. I see. But it's a common thing they do for orphaned milk-fed or breastfed animals. And they're always creepy. Huh. Well, I mean, to us, yeah, but to a dumb baby monkey, they're just like, sweet, give me the teat. Yeah, there's a T-shirt, maybe even a band name: Sweet, Give Me the Teat. Yeah, this kind of falls into the long band name category. Sure.

But here's the thing: not everyone agrees with this whole thing. Like you said earlier, there's a man named David Hanson, and he's a roboticist as well, in Plano, Texas, and he did a very, very basic study.
It was a survey where they showed images of two different robots that were animated to simulate human facial expressions, and basically just asked, hey, what do you think of this? And seventy percent said, I like them. Yeah. Can you see why people had trouble with this study, though? Yeah. He said not one person said they were disturbed. Yeah. Okay. Sounds good.

For the most part, though, studies into the Uncanny Valley are like, now we're finding something here. Yeah, although we should be suspicious of ones that basically show the exact Uncanny Valley that Mori just graphed out, like, freehand, right? Like, if you've come across a study that shows that same thing, they're probably cherry-picking data. Are you about to say "out of his butt"? Maybe.

There was another study, Edward Schneider at SUNY Potsdam in New York. I bet they don't even have a mascot. They got together seventy-five characters from cartoons and video games, everyone from Mickey Mouse to Lara Croft, who is very attractive, by the way. She's a computer character. Yeah. Well, no, I'm talking about playing Tomb Raider. Yeah, I never played. Yeah, when it first came out, you know, I played Tomb Raider and I was like, oh, look at her. Lara Croft is kind of hot. Well, she gets a lot of stuff done. That's very attractive. I'm not sure what that means. Well, she travels a lot. Oh, she's an independent person. Yeah, that's what I meant. I was attracted to her mind and her adventures. Right. So anyway, they asked people in this study, how attractive do you think these characters are, or how repulsive do you think they are? And again, there was a graph with a dip in it at a certain point, as you would expect. Yep. Careful, careful, everybody.

So if you're a robot designer, right, one of the things, like even back in his essay written in nineteen seventy, Masahiro Mori said, there's problems here with movement.
There's problems here with the smile. It has something to do with the face, right? Yeah. And somebody else said, I don't remember who it was, but there always seems to be a lag between how lifelike a designer can make a robot look and how lifelike an engineer can make that robot move. Yeah. Right. And that disconnect, in Mori's mind, was a big part of the Uncanny Valley, but he also seemed to focus on the smile and the eyes.

And one of the things that's at stake, like, besides this just being an interesting topic of discussion, there are actual real-world implications for this whole thing, right? Like, if you're a robot designer, you want to create something that's not going to freak people out, because the whole purpose of robots, I should say lifelike-looking robots, is to interact with humans, and you want them to interact with humans. Right, because is Ford Motor Company ever gonna buy an android that looks human just to work on an assembly line, when they can get the same thing that does the same job cheaper when it just looks like a robotic arm or something? Right. The whole purpose of a lifelike-looking robot is that that robot is being designed to interact with humans. And if you are going to run into this spot, and some people say it's not even a valley, some people think it's insurmountable, a cliff or a wall, so if you're going to run up against this, you want to figure out how to overcome it, because you don't want to creep people out with your creations. Well, and you don't want to spend a lot of money to develop a robotic Walmart greeter at every store, because it's happening, like, this is coming, people.

Yeah, there's a robot called Geminoid F, or Actroid F, depending on who you ask. I've also seen her called Ellie. And she is out of this lab by a guy named Hiroshi Ishiguro, and he is probably the world's leading roboticist.
If you've seen any lifelike android, it probably came out of this guy's lab. Right. Yeah. And she is starting to get out there in the world. She's been a debriefer of soldiers coming back from war with PTSD, based on the idea that they might share more with a robot that they knew was just a robot than they would with an actual human. Yeah, and she's in a play. She stars as an android. Good role. Right.

And then there's Casper. There's a little robot called Casper. Yeah, Casper is a robot boy with a great cause, created to help children with autism learn to read facial emotions. If you look up photos of both of these, Geminoid F looks great, and really, like, Ishiguro is doing great, great work. Casper looks terrifying. Right. And so Casper is creepy, but that's not his purpose at all. His purpose is to teach kids with autism how to connect. But if he's repelling them through this Uncanny Valley, he's defeating the purpose. Well, they should go to Ishiguro and say, hey, we have this great cause, can you make us something that doesn't look like the stuff of nightmares? Right, exactly. I wonder if Casper has been effective, you know? I don't know. I don't know. Now I feel bad I didn't look into that. Well, I just... I don't know, he's very creepy looking. I agree wholeheartedly. It's kind of like, no, he's not finished, get back to the drawing board. Either that, or, and this is what Mori said, go the other way. Just make him not human at all, just cute or approachable. Right.

So the roboticists are not the only ones who are facing this, Chuck. There is a pretty powerful, moneyed contingent of people who are interested stakeholders in overcoming the Uncanny Valley, or at least figuring out if it's totally insurmountable. And that is Hollywood. Yeah, Hollywood has a sort of rich history of getting it wrong when it comes to creepy CGI characters. Pixar, their very first short film actually is called Tin Toy.
It's a little five-minute short, and this preceded Toy Story and everything. Yeah, it was actually kind of like the outline of Toy Story's plot. Yeah, but they showed it to test audiences, and they made the mistake of making the baby, Billy, look too realistic. And everyone loved Tin Toy, and everyone hated Billy. Yeah, have you seen it? Yeah. Yeah, so I hate Billy. Yeah, he's pretty hateable, for sure, and he is the antagonist, but he struck some chord with viewers that Pixar did not mean to strike. Right. And they actually, I mean, this is extraordinarily fortunate for Pixar, sure, this is very early on in their history, and they learned from it. They're like, okay, note to self, don't try to make any of these characters lifelike, let's go a different direction. And so they came up with those exaggerated features that we've all just come to know and love. Yeah, which was a great, great direction to go in. Yeah, obviously, because they've had tons of success with that model. Right, you can make the case that it may have saved the company, because other companies and other movies, for sure, have not been nearly as fortunate.

Yeah. One of the first big photoreal computer-animated movies was Final Fantasy, colon, The Spirits Within. You should never have a colon in your movie title, by the way, so that was the first mistake. But this one was from two thousand and one, and based on the video game, and it was off-putting to a lot of people, and it was a big, big bomb for Columbia Pictures. But this was before Uncanny Valley had really been established, before Mori's essay was translated, so reviewers didn't quite know what to say. Now they would just say, we've tumbled into the Uncanny Valley again. But then they would say things like, Peter Travers, great reviewer from Rolling Stone, said, "At first it's fun to watch the characters," ellipsis, ellipsis, ellipsis.
But what's with the ellipses? Is that two of them? A couple of them. "But then you notice a coldness in the eyes, a mechanical quality in the movements, familiar voices emerging from the mouths of replicants erect a distance." Yeah, so he's describing the Uncanny Valley. He just didn't have the name for it yet.

And then a couple of years later you had The Polar Express, which became, I think, even more famous than Final Fantasy, totally, as far as the Uncanny Valley goes. But again, like you said, you know, the reviewers didn't quite know how to put their finger on it. And I'm not quite sure how Final Fantasy was done, but I know that The Polar Express used similar software and hardware to what roboticists are using now, where it's like motion capture, but rather than translating the motion to a robot, it's translating the motion into like a digital 3D rendering of the character. Right. Yeah. So The Polar Express was really, really expressive, but not quite there, so it fell really hard into the Uncanny Valley. And I think David Germain of the Associated Press compared the kids in this heartwarming family Christmas movie to the children from Village of the Damned. Yeah. Which is not what you want. It's not at all what the studio wanted, and I think it lost a pretty decent amount of money.

Yeah, there was another one, and these are all, by the way, courtesy of Robert Zemeckis. He went all in on this technology. I don't know why. I think sometimes as an artist you can get so wrapped up in the coolness of, wow, look what we can do now, that you don't step back and look at what you're doing, like, should we be doing this? Because he also had a part in the Beowulf movie in two thousand and seven that was a huge bomb, and The New York Times said this about it: "People who are meant to be enraged, who are at risk of plummeting to their deaths, just look a little out of sorts."
When it was over, 587 00:36:15,160 --> 00:36:17,640 Speaker 1: I felt relieved to be back in the company of 588 00:36:17,800 --> 00:36:23,520 Speaker 1: uncreepy flesh-and-blood humans again. Sad. And then did 589 00:36:23,520 --> 00:36:26,800 Speaker 1: you see The Adventures of Tintin? Yeah, I really liked Tintin, 590 00:36:26,880 --> 00:36:30,360 Speaker 1: though. I did too. I think Spielberg, I mean, there 591 00:36:30,480 --> 00:36:34,480 Speaker 1: is that uncanny valley a little bit, but the story 592 00:36:34,520 --> 00:36:37,600 Speaker 1: and the movie were so good he overcame that, I think. 593 00:36:37,880 --> 00:36:41,239 Speaker 1: I was about to say, I think Spielberg has come 594 00:36:41,280 --> 00:36:47,759 Speaker 1: the closest to overcoming that chasm of anybody. But did 595 00:36:47,760 --> 00:36:50,320 Speaker 1: he do it through good storytelling or through the eyes? 596 00:36:50,600 --> 00:36:53,200 Speaker 1: I don't know. I don't know if it, I don't know 597 00:36:53,239 --> 00:36:56,799 Speaker 1: if it was a combination of the two. I don't know. 598 00:36:56,840 --> 00:37:02,040 Speaker 1: But it is extraordinary. So, you know, 599 00:37:02,080 --> 00:37:04,719 Speaker 1: that stuff you'll see every once in a while, 600 00:37:04,719 --> 00:37:07,359 Speaker 1: where somebody does like what Beavis and Butt-Head would 601 00:37:07,360 --> 00:37:10,239 Speaker 1: actually look like in real life, or what Charlie Brown 602 00:37:10,239 --> 00:37:13,400 Speaker 1: would look like in real life. Fantastic, right. So it 603 00:37:13,880 --> 00:37:17,319 Speaker 1: still has kind of got a cartoonish quality to it. 604 00:37:17,400 --> 00:37:20,319 Speaker 1: It's the same thing with the Tintin movie. But it 605 00:37:20,400 --> 00:37:23,200 Speaker 1: was like, it was as if you were living in 606 00:37:23,200 --> 00:37:29,120 Speaker 1: a dimension where humans looked somewhat cartoonish. Is that making 607 00:37:29,160 --> 00:37:30,879 Speaker 1: any sense or does that just make the whole thing 608 00:37:30,880 --> 00:37:33,879 Speaker 1: even harder to understand? No, I get that. So 609 00:37:33,960 --> 00:37:37,920 Speaker 1: he somehow was like, here, I'm not trying to nail 610 00:37:38,080 --> 00:37:40,520 Speaker 1: what humans look like. I'm going to take you to 611 00:37:40,560 --> 00:37:43,319 Speaker 1: another world where these people live. And if you lived 612 00:37:43,360 --> 00:37:45,319 Speaker 1: in this world, you would look like this too. 613 00:37:45,520 --> 00:37:48,239 Speaker 1: It's weird. It's like he bridged an uncanny valley that 614 00:37:48,280 --> 00:37:51,360 Speaker 1: doesn't exist in this dimension. Yeah, he built a temporary 615 00:37:51,840 --> 00:37:56,160 Speaker 1: disintegrating bridge across the uncanny valley, right. I think the 616 00:37:56,160 --> 00:37:58,600 Speaker 1: biggest example in recent years was, or the one that 617 00:37:58,640 --> 00:38:01,360 Speaker 1: got the most attention was, in Rogue One. Did you 618 00:38:01,360 --> 00:38:03,919 Speaker 1: see that? The Star Wars movie. I haven't seen any 619 00:38:03,920 --> 00:38:06,160 Speaker 1: of the new Star Wars ones. Except, I've seen 620 00:38:06,280 --> 00:38:10,480 Speaker 1: the first six, I guess, but none of the two 621 00:38:10,640 --> 00:38:15,600 Speaker 1: new ones.
Well, in Rogue One, they completely bring 622 00:38:15,640 --> 00:38:19,759 Speaker 1: back to life Grand Moff Tarkin, who was played by 623 00:38:19,440 --> 00:38:23,960 Speaker 1: the deceased Peter Cushing, and they brought him back as 624 00:38:24,000 --> 00:38:28,239 Speaker 1: a character in this movie. And in the theater, like, 625 00:38:28,320 --> 00:38:31,839 Speaker 1: when it first happens, he's got his back 626 00:38:31,880 --> 00:38:33,719 Speaker 1: to you, and it's sort of in the shadows, and 627 00:38:33,719 --> 00:38:37,360 Speaker 1: you're like, oh wow, like, that's pretty cool. And I 628 00:38:37,440 --> 00:38:40,600 Speaker 1: didn't know that they would do that. But they 629 00:38:40,640 --> 00:38:45,440 Speaker 1: got too comfortable, I think, and showed too much and 630 00:38:45,719 --> 00:38:48,640 Speaker 1: gave him too many lines and too much light, and 631 00:38:48,800 --> 00:38:52,720 Speaker 1: then it became Uncanny Valley really. Oh yeah, for sure. 632 00:38:52,920 --> 00:38:56,480 Speaker 1: Think about poor Peter Cushing's family having to see that. Yeah, 633 00:38:56,560 --> 00:38:59,360 Speaker 1: I don't know how they wouldn't just weep during 634 00:38:59,600 --> 00:39:02,040 Speaker 1: that movie. Well, I'm curious about like life rights and 635 00:39:02,080 --> 00:39:03,719 Speaker 1: image rights and stuff like that, if they had to 636 00:39:04,239 --> 00:39:06,279 Speaker 1: get that cleared. I don't even know. I'm sure there's 637 00:39:06,280 --> 00:39:09,920 Speaker 1: a backstory there. Oh, Cushing was famously mellow. 638 00:39:09,960 --> 00:39:12,600 Speaker 1: He would have taken a draw off his doobie 639 00:39:12,600 --> 00:39:15,319 Speaker 1: and been like, that's whatever, man. Yeah. I think he 640 00:39:15,360 --> 00:39:17,640 Speaker 1: spent the last year of his life on his weed 641 00:39:17,719 --> 00:39:22,840 Speaker 1: farm in northern California. Right. What about this Mars Needs Moms? 642 00:39:22,880 --> 00:39:25,080 Speaker 1: I had never ever heard of that movie, and so 643 00:39:25,719 --> 00:39:27,960 Speaker 1: I went and watched the trailer and I still was like, 644 00:39:28,120 --> 00:39:30,560 Speaker 1: I have no idea what this is. Yeah, you know 645 00:39:30,600 --> 00:39:34,319 Speaker 1: that comic strip Bloom County? Well, you know, I'm a huge, 646 00:39:34,400 --> 00:39:37,239 Speaker 1: huge lifelong Bloom County fan. Oh okay, so 647 00:39:37,719 --> 00:39:39,879 Speaker 1: maybe you know how to say the last name. 648 00:39:40,360 --> 00:39:45,200 Speaker 1: It's Berkeley Breathed, or breath-ed? They said Breathed, but I 649 00:39:45,239 --> 00:39:46,759 Speaker 1: don't know if I've ever heard it said out loud. 650 00:39:47,040 --> 00:39:50,239 Speaker 1: Breathed sounds nice. Let's go with that. So Berkeley Breathed 651 00:39:50,440 --> 00:39:53,759 Speaker 1: the person, the guy who did Bloom County. Yeah, he 652 00:39:53,800 --> 00:39:57,080 Speaker 1: wrote a book, a children's book called Mars Needs Moms, 653 00:39:57,360 --> 00:40:00,680 Speaker 1: and basically Mars had some sort of shortage of moms. 654 00:40:01,040 --> 00:40:04,160 Speaker 1: So the Martians came and kidnapped human moms and it 655 00:40:04,200 --> 00:40:06,040 Speaker 1: was up to the human kids to go get their 656 00:40:06,040 --> 00:40:09,200 Speaker 1: moms back from Mars.
Right, pretty cute little premise, but 657 00:40:09,320 --> 00:40:12,160 Speaker 1: they took it and ran it through Zemeckis's nightmare mill, 658 00:40:13,080 --> 00:40:17,840 Speaker 1: right. ImageMovers Digital was the trade 659 00:40:17,920 --> 00:40:20,160 Speaker 1: name of it, but everybody knew to just steer clear 660 00:40:20,160 --> 00:40:22,440 Speaker 1: of this place, right. Yeah. And this was like the 661 00:40:22,560 --> 00:40:27,280 Speaker 1: apex, or, what's the opposite of the apex? The valley, 662 00:40:27,840 --> 00:40:31,600 Speaker 1: I guess, so the deepest part of the CGI valley, 663 00:40:31,920 --> 00:40:36,080 Speaker 1: of the Uncanny Valley, right. The stuff 664 00:40:36,120 --> 00:40:39,640 Speaker 1: that they created was so off, and just so 665 00:40:39,680 --> 00:40:43,879 Speaker 1: spectacularly and colossally off, that when, I guess, Disney came 666 00:40:43,920 --> 00:40:47,680 Speaker 1: along and bought this company, they came in, looked around 667 00:40:47,760 --> 00:40:51,200 Speaker 1: and said, we're shutting you down. This movie's it, we're 668 00:40:51,200 --> 00:40:53,520 Speaker 1: not doing this anymore. What you guys are doing here 669 00:40:53,600 --> 00:40:57,239 Speaker 1: is wrong, and you're all going to jail. Yeah, here's 670 00:40:57,280 --> 00:41:00,120 Speaker 1: my thoughts on that. I watched the trailer and it 671 00:41:00,160 --> 00:41:02,279 Speaker 1: didn't look any worse than any of the other ones 672 00:41:02,320 --> 00:41:05,239 Speaker 1: to me. And in fact, I don't know the characters' names, 673 00:41:05,239 --> 00:41:06,920 Speaker 1: but there's a kid and then there's this one kind 674 00:41:06,960 --> 00:41:10,160 Speaker 1: of chubby guy on Mars. Yeah, the chubby guy looked 675 00:41:10,160 --> 00:41:13,440 Speaker 1: pretty good, actually, I thought. I think this was a victim. 676 00:41:13,520 --> 00:41:16,719 Speaker 1: I bet the movie sucked really bad. Yeah, and I 677 00:41:16,760 --> 00:41:18,839 Speaker 1: think it was the last straw at the end of 678 00:41:18,880 --> 00:41:22,839 Speaker 1: all these Uncanny Valley failures. Yeah, because this again, this 679 00:41:22,920 --> 00:41:27,759 Speaker 1: is the same company that had created Polar Express. Yeah, 680 00:41:27,800 --> 00:41:30,799 Speaker 1: the nightmare factory. And A Christmas Carol did not do 681 00:41:30,960 --> 00:41:34,040 Speaker 1: very well either. So yeah, I think it definitely bore 682 00:41:34,080 --> 00:41:37,480 Speaker 1: the brunt of its predecessors as well. Yeah, but I 683 00:41:37,520 --> 00:41:40,279 Speaker 1: thought this was as bad as it got, if you 684 00:41:40,280 --> 00:41:44,600 Speaker 1: ask me. I totally saw what Disney saw with this one. Yeah. 685 00:41:44,640 --> 00:41:47,960 Speaker 1: Anytime something is marked as the thing that killed the thing, right, 686 00:41:48,160 --> 00:41:52,560 Speaker 1: it's always just the last thing. Yeah, you're right, you know. Yeah, anyway, 687 00:41:52,600 --> 00:41:54,440 Speaker 1: but it could have also been the thing that saved 688 00:41:54,440 --> 00:41:58,520 Speaker 1: the thing had they gotten it right. You know, that's true.
689 00:41:59,080 --> 00:42:01,840 Speaker 1: So, Mori was like, and every time I 690 00:42:01,920 --> 00:42:04,960 Speaker 1: say Mori now, unless I say it like Mori, just 691 00:42:05,040 --> 00:42:08,000 Speaker 1: saying Morrie, Jewish guy, I think of the 692 00:42:08,040 --> 00:42:13,880 Speaker 1: wig salesman in Goodfellas. Yeah, he's like, give me my money, and 693 00:42:14,080 --> 00:42:17,160 Speaker 1: Ray Liotta's in there laughing because Morrie's toupee falls off. 694 00:42:17,920 --> 00:42:20,080 Speaker 1: Imagine that guy is the guy who came up with 695 00:42:20,120 --> 00:42:23,320 Speaker 1: the Uncanny Valley. Okay, it gives a whole different spin 696 00:42:23,400 --> 00:42:28,959 Speaker 1: to it, right? Yeah. So Mori says, um, just don't 697 00:42:29,000 --> 00:42:31,479 Speaker 1: even try, guys. Like, you're never going to be able 698 00:42:31,480 --> 00:42:34,080 Speaker 1: to do this. Even if you can, we're so far 699 00:42:34,160 --> 00:42:36,560 Speaker 1: away from it. And this is in nineteen seventy he 700 00:42:36,600 --> 00:42:39,160 Speaker 1: was saying it, and it still holds true now. Yeah, 701 00:42:39,320 --> 00:42:42,200 Speaker 1: we're so far away from this that just maybe 702 00:42:43,040 --> 00:42:46,160 Speaker 1: put your emphasis elsewhere. And the example he 703 00:42:46,200 --> 00:42:49,040 Speaker 1: gave was, say, like a prosthetic hand, right? Yeah. Rather 704 00:42:49,080 --> 00:42:52,520 Speaker 1: than trying to create a lifelike prosthetic hand that 705 00:42:52,719 --> 00:42:56,719 Speaker 1: was in danger of creeping people out, which is the 706 00:42:56,719 --> 00:43:00,239 Speaker 1: opposite of what somebody wearing a prosthetic hand wants when 707 00:43:00,239 --> 00:43:02,920 Speaker 1: they're walking around with the prosthetic hand, he said, you know, 708 00:43:03,200 --> 00:43:07,920 Speaker 1: maybe choose something like wood, well-sanded, beautifully grained 709 00:43:07,960 --> 00:43:11,080 Speaker 1: wood in the shape of a human hand. It gets 710 00:43:11,080 --> 00:43:13,600 Speaker 1: the point across: this is my hand. I lost my hand. 711 00:43:13,600 --> 00:43:15,440 Speaker 1: I don't have my hand. But there's nothing to be 712 00:43:15,440 --> 00:43:18,279 Speaker 1: creeped out about here. It's kind of beautiful looking, isn't it? 713 00:43:18,760 --> 00:43:21,840 Speaker 1: That was Mori's take, and a lot of people side 714 00:43:21,840 --> 00:43:23,960 Speaker 1: with him as well. As a matter of fact, you know, 715 00:43:24,000 --> 00:43:26,000 Speaker 1: I said, I think at the beginning, that he was 716 00:43:26,120 --> 00:43:30,440 Speaker 1: already an established roboticist when he wrote The Uncanny Valley 717 00:43:30,440 --> 00:43:33,319 Speaker 1: in nineteen seventy, and he went on to teach a 718 00:43:33,320 --> 00:43:37,240 Speaker 1: lot of people robotics, or a lot of roboticists, as well. 719 00:43:37,440 --> 00:43:43,360 Speaker 1: And that very famous robot ASIMO. Asimo, you know, the 720 00:43:43,400 --> 00:43:45,480 Speaker 1: one I'm talking about. He was one of the first 721 00:43:45,520 --> 00:43:48,480 Speaker 1: ones that could jog in place. And he's kind of 722 00:43:48,800 --> 00:43:53,680 Speaker 1: humanoid for sure, but very cute, all white, shiny lacquer plastic. Yeah, 723 00:43:54,120 --> 00:43:58,200 Speaker 1: you've seen him before.
He was created by one of 724 00:43:58,400 --> 00:44:03,080 Speaker 1: Mori's students, who clearly subscribed to Mori's theory that 725 00:44:03,080 --> 00:44:05,960 Speaker 1: you're not gonna overcome the Uncanny Valley. 726 00:44:06,040 --> 00:44:10,440 Speaker 1: So just make these things exaggerated and non-humanlike. Yeah, 727 00:44:10,480 --> 00:44:13,440 Speaker 1: and you'll have people love your robot. Yeah, I 728 00:44:13,480 --> 00:44:16,200 Speaker 1: think that's a good tack. Yeah. All right, we're gonna 729 00:44:16,200 --> 00:44:19,120 Speaker 1: take another break here and then come back and finish 730 00:44:19,200 --> 00:44:21,680 Speaker 1: up with a little bit where we're gonna take a step 731 00:44:21,680 --> 00:44:51,839 Speaker 1: back and just talk generally about creepiness. All right. So 732 00:44:52,200 --> 00:44:55,960 Speaker 1: I promised that we would talk about creepiness. So that's 733 00:44:55,960 --> 00:44:59,440 Speaker 1: what we'll do. You promised, Chuck. The creeps, such a 734 00:44:59,440 --> 00:45:02,520 Speaker 1: great phrase. Everyone says it gives me the creeps. It's 735 00:45:02,560 --> 00:45:05,440 Speaker 1: just, it's one of those phrases that 736 00:45:05,560 --> 00:45:09,120 Speaker 1: sums things up so perfectly. It's livid as a fresh bruise. 737 00:45:10,160 --> 00:45:14,400 Speaker 1: And we have Charles Dickens to thank for this, evidently, 738 00:45:14,680 --> 00:45:18,000 Speaker 1: because he gets credit for using the creeps in David 739 00:45:18,040 --> 00:45:22,040 Speaker 1: Copperfield in eighteen forty-nine. People had had this feeling before, 740 00:45:22,600 --> 00:45:26,120 Speaker 1: this sort of unpleasant off feeling, you know what it feels 741 00:45:26,120 --> 00:45:28,080 Speaker 1: like to get the creeps. But they said things like 742 00:45:28,200 --> 00:45:31,480 Speaker 1: eel-like or clammy. Not bad, not bad. But if 743 00:45:31,480 --> 00:45:34,200 Speaker 1: you said that thing makes me feel eel-like today, 744 00:45:34,239 --> 00:45:36,359 Speaker 1: people would be like, what the heck are you talking about? Right. 745 00:45:37,200 --> 00:45:39,840 Speaker 1: I think also you would use that to describe somebody 746 00:45:40,000 --> 00:45:41,920 Speaker 1: who gave you the creeps as well, like, that guy's 747 00:45:41,960 --> 00:45:45,640 Speaker 1: really clammy, you know what I mean? Sure, although that 748 00:45:45,680 --> 00:45:48,399 Speaker 1: means you're touching them, though. Like, Peter Lorre would 749 00:45:48,440 --> 00:45:50,880 Speaker 1: be clammy or eel-like in some of his characters. 750 00:45:51,840 --> 00:45:57,560 Speaker 1: You know Peter Lorre. I do too. So everybody understands 751 00:45:57,560 --> 00:46:00,399 Speaker 1: that there is such a thing as the creeps, right? But 752 00:46:01,800 --> 00:46:06,120 Speaker 1: we don't understand why we get the creeps, still to 753 00:46:06,200 --> 00:46:08,440 Speaker 1: this day. And again, this is important and relates to 754 00:46:08,480 --> 00:46:11,960 Speaker 1: the Uncanny Valley, because another way to put the creeps 755 00:46:12,960 --> 00:46:17,440 Speaker 1: is negative affinity. Remember, affinity was the y-axis, and 756 00:46:17,560 --> 00:46:20,719 Speaker 1: when the valley dropped down below the x-axis, you 757 00:46:20,840 --> 00:46:25,439 Speaker 1: dipped into negative affinity, into the creeps. The creeps, exactly right.
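For anyone trying to picture the graph being described here, below is a minimal Python sketch of a schematic uncanny valley curve: human likeness runs along the x-axis, affinity along the y-axis, and "the creeps" is the stretch where the curve dips below zero. The curve shape is purely illustrative; Mori drew the original freehand, so the dip's position and depth here are made-up stand-ins, not his data.

```python
# A schematic uncanny valley curve -- illustrative only, not Mori's data.
import numpy as np
import matplotlib.pyplot as plt

likeness = np.linspace(0.0, 1.0, 500)  # x-axis: human likeness, 0% to 100%

# Affinity rises with likeness, then a sharp Gaussian dip just before
# full human likeness pulls it below zero -- the "negative affinity"
# region discussed above. The dip center (0.85) and depth are assumptions.
affinity = np.sin(likeness * np.pi * 0.9) \
    - 2.2 * np.exp(-((likeness - 0.85) ** 2) / 0.004)

plt.plot(likeness, affinity)
plt.axhline(0, color="gray", linewidth=0.8)  # below this line: the creeps
plt.xlabel("human likeness")
plt.ylabel("affinity")
plt.title("Schematic uncanny valley")
plt.show()
```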
758 00:46:25,520 --> 00:46:30,520 Speaker 1: So you talked about Ernst Jentsch. Yeah, yeah, I get it. 759 00:46:30,640 --> 00:46:33,879 Speaker 1: Sure. He was probably the first person to actually sit 760 00:46:33,960 --> 00:46:37,799 Speaker 1: down and study the creeps, or creepiness. I bet he 761 00:46:37,880 --> 00:46:40,480 Speaker 1: was creepy himself. I don't know. I think he was 762 00:46:40,520 --> 00:46:45,480 Speaker 1: just kind of a neat-thinking man, right. So Jentsch, man, 763 00:46:45,560 --> 00:46:48,439 Speaker 1: I like saying his name a lot more now. He 764 00:46:49,120 --> 00:46:52,040 Speaker 1: wrote an essay in nineteen oh six called On the 765 00:46:52,080 --> 00:46:57,200 Speaker 1: Psychology of the Uncanny, and that's the English translation. The 766 00:46:57,280 --> 00:47:00,400 Speaker 1: German word he used, like you said, is heimlich, 767 00:47:00,880 --> 00:47:06,080 Speaker 1: is that right? Mm, no, unheimlich. Unheimlich. Yeah, okay, nice, 768 00:47:06,239 --> 00:47:10,399 Speaker 1: getting better, thank you. Uh, he used that word. 769 00:47:10,440 --> 00:47:15,440 Speaker 1: And unheimlich is a variation of the word heimlich, um, 770 00:47:15,760 --> 00:47:18,160 Speaker 1: which is not just to say the maneuver. It means 771 00:47:18,200 --> 00:47:23,560 Speaker 1: something else entirely, which is homey, or familiar, right? Yeah. 772 00:47:23,840 --> 00:47:27,760 Speaker 1: Unheimlich is the opposite of that. It's something strange and foreign, 773 00:47:28,040 --> 00:47:31,720 Speaker 1: and very frequently is translated into uncanny here in the West, 774 00:47:31,840 --> 00:47:37,920 Speaker 1: here in England. Yeah, and he, uh, 775 00:47:37,960 --> 00:47:39,640 Speaker 1: he thought a lot about this, and one of the 776 00:47:39,680 --> 00:47:41,960 Speaker 1: things that he noted, which I thought was pretty interesting, 777 00:47:42,520 --> 00:47:46,240 Speaker 1: was that people that he thought were more intellectually discriminating 778 00:47:46,960 --> 00:47:51,280 Speaker 1: um are more prone to have these uncanny experiences because 779 00:47:51,320 --> 00:47:55,719 Speaker 1: they're critical thinkers about the world. Right. So, uh, that 780 00:47:55,800 --> 00:47:58,640 Speaker 1: makes sense. Like, they just pay attention maybe a little more, yeah, 781 00:47:58,760 --> 00:48:00,960 Speaker 1: or they're curious, like they're like, why am I 782 00:48:01,000 --> 00:48:02,600 Speaker 1: creeped out? Let me get to the bottom of this, 783 00:48:02,719 --> 00:48:04,480 Speaker 1: rather than, oh, I'm creeped out, and go 784 00:48:04,880 --> 00:48:07,759 Speaker 1: eat a whole thing of Chips Ahoy and hide under 785 00:48:07,760 --> 00:48:12,080 Speaker 1: the covers. Right. He also actually went even further and said, 786 00:48:12,120 --> 00:48:16,640 Speaker 1: it's possible that all of humanity's knowledge has been 787 00:48:16,640 --> 00:48:23,000 Speaker 1: accrued over these millions of years from the people investigating 788 00:48:23,040 --> 00:48:26,160 Speaker 1: what's behind this creepiness. Yeah, it's a pretty weird and 789 00:48:26,280 --> 00:48:30,120 Speaker 1: neat theory of knowledge. Well, yeah, and speaking of theories, 790 00:48:30,120 --> 00:48:33,759 Speaker 1: there are a bunch of theories on creepiness and why 791 00:48:33,760 --> 00:48:36,160 Speaker 1: this happens, and I think they're all pretty interesting. Yeah.
792 00:48:36,520 --> 00:48:39,319 Speaker 1: The first one is called pathogen avoidance theory, and we 793 00:48:39,400 --> 00:48:43,439 Speaker 1: talked earlier about evolution, and this one kind of fits 794 00:48:43,480 --> 00:48:49,400 Speaker 1: into that bucket. Basically, a warning that we have evolved 795 00:48:50,040 --> 00:48:53,200 Speaker 1: to have in our brain says that person is off, 796 00:48:53,360 --> 00:48:57,399 Speaker 1: they are diseased even, you don't want to go near them, right, 797 00:48:57,760 --> 00:49:01,879 Speaker 1: you want to avoid that pathogen. Makes sense. Yeah, it's 798 00:49:01,920 --> 00:49:05,919 Speaker 1: pretty approachable. Sure. There's another one that I've seen 799 00:49:05,960 --> 00:49:08,920 Speaker 1: that's, I think, fairly recent, and it's the idea that 800 00:49:09,040 --> 00:49:14,000 Speaker 1: things give us the creeps when they're trying to 801 00:49:14,920 --> 00:49:20,280 Speaker 1: nonverbally mimic people. Yeah, and so, like, a robot doesn't 802 00:49:20,320 --> 00:49:24,560 Speaker 1: do it, so we're like, oh, that's unsettling. Or somebody 803 00:49:24,600 --> 00:49:27,760 Speaker 1: who you would describe as clammy or eel-like maybe 804 00:49:27,880 --> 00:49:30,560 Speaker 1: overdoes it a little bit. Yeah, like they're trying to 805 00:49:30,600 --> 00:49:34,080 Speaker 1: fit in. It's not natural to them. Yeah, and that 806 00:49:34,200 --> 00:49:36,640 Speaker 1: can give you the creeps as well. That makes sense, 807 00:49:36,680 --> 00:49:40,040 Speaker 1: but it doesn't really encompass everything. It's definitely not a 808 00:49:40,600 --> 00:49:43,279 Speaker 1: unified theory of creepiness. It just seems to kind of 809 00:49:43,280 --> 00:49:48,080 Speaker 1: inhabit one corner of the creepy spectrum. Yeah, there's another 810 00:49:48,120 --> 00:49:52,920 Speaker 1: one called violation of expectation. This is like, you know, 811 00:49:53,000 --> 00:49:55,960 Speaker 1: you've shaken hands with thousands of people over your life, 812 00:49:56,280 --> 00:49:58,640 Speaker 1: but if you go and you shake a hand and 813 00:49:59,120 --> 00:50:01,680 Speaker 1: you don't know that you're going to get a prosthetic hand, 814 00:50:02,719 --> 00:50:05,680 Speaker 1: it may give you the creeps, right. And that is 815 00:50:05,719 --> 00:50:08,760 Speaker 1: probably very fleeting, because you might just say, oh, okay, 816 00:50:08,840 --> 00:50:10,839 Speaker 1: well, it doesn't give me the creeps now, but it's 817 00:50:10,880 --> 00:50:14,200 Speaker 1: just unexpected for me. And actually you said that was fleeting, right, Chuck. 818 00:50:14,280 --> 00:50:18,040 Speaker 1: So I think it was Jentsch or somebody who said 819 00:50:18,800 --> 00:50:23,399 Speaker 1: that creepiness, what gives us the creeps one time, might 820 00:50:23,440 --> 00:50:26,000 Speaker 1: not give us the creeps later on. Yeah, which will 821 00:50:26,080 --> 00:50:29,400 Speaker 1: kind of come into play later. Like, Ernst Jentsch basically 822 00:50:29,440 --> 00:50:32,520 Speaker 1: laid the groundwork for the study of creepiness, and 823 00:50:32,760 --> 00:50:35,000 Speaker 1: he seems to have gotten a lot of it right, 824 00:50:35,120 --> 00:50:38,240 Speaker 1: right out of the gate.
Yeah, and like you said, 825 00:50:38,320 --> 00:50:40,960 Speaker 1: if it doesn't give you the creeps later, then that 826 00:50:41,000 --> 00:50:46,000 Speaker 1: would fit neatly into the violation of expectation, because then 827 00:50:46,040 --> 00:50:50,000 Speaker 1: you can change your expectation. Right, exactly. Yes, yes. Another 828 00:50:50,040 --> 00:50:55,760 Speaker 1: one's mortality salience theory. Yeah, this one Mori and Freud 829 00:50:55,800 --> 00:51:01,759 Speaker 1: both subscribed to, and it basically says that when 830 00:51:01,800 --> 00:51:06,200 Speaker 1: we encounter like a robot, or an automaton in Freud's day, 831 00:51:07,280 --> 00:51:09,719 Speaker 1: they remind us of dead people, which in turn gets 832 00:51:09,719 --> 00:51:12,000 Speaker 1: our mind to thinking about how we're going to die 833 00:51:12,080 --> 00:51:13,879 Speaker 1: one day, and so all of a sudden we find 834 00:51:13,880 --> 00:51:18,040 Speaker 1: ourselves in the uncanny valley, right. Which again raises another, 835 00:51:18,080 --> 00:51:21,440 Speaker 1: sorry for the sidetrack, but raises another of Jentsch's points: 836 00:51:22,680 --> 00:51:27,960 Speaker 1: is uncanniness inherent in the object, or is it inside 837 00:51:28,000 --> 00:51:31,919 Speaker 1: the observer who's experiencing the creeps or uncanniness? I think 838 00:51:31,920 --> 00:51:35,160 Speaker 1: it's in the observer. Yeah, I think it is, too, 839 00:51:35,160 --> 00:51:38,239 Speaker 1: which would explain why it can go away when 840 00:51:38,239 --> 00:51:42,319 Speaker 1: you come to experience it again. Yeah, like, when 841 00:51:42,360 --> 00:51:44,560 Speaker 1: you shake the same prosthetic 842 00:51:44,600 --> 00:51:47,719 Speaker 1: hand again, it's not creepy the second time. It might 843 00:51:47,760 --> 00:51:51,880 Speaker 1: even be interesting. Or why some people might not experience 844 00:51:51,920 --> 00:51:54,520 Speaker 1: it at all. Like, someone might sit there and see 845 00:51:54,560 --> 00:51:56,640 Speaker 1: a doll, and the doll's head turns and looks at them, 846 00:51:56,680 --> 00:51:59,839 Speaker 1: and they're like, neat, right? How much for that doll? 847 00:52:00,160 --> 00:52:03,560 Speaker 1: Which means you've just met a serial killer, right? And 848 00:52:03,600 --> 00:52:07,560 Speaker 1: then the doll's creeped out after that. This one I 849 00:52:07,600 --> 00:52:10,279 Speaker 1: like, even though I can never say this word 850 00:52:10,280 --> 00:52:15,400 Speaker 1: for some reason: anthropomorphism, nice job, dehumanization dichotomy, 851 00:52:15,440 --> 00:52:19,080 Speaker 1: which is basically us attributing these human attributes to the 852 00:52:19,160 --> 00:52:24,239 Speaker 1: robot until we realize that they don't have them, right. So, 853 00:52:24,360 --> 00:52:26,680 Speaker 1: like, we're looking at this robot that looks like a person. 854 00:52:26,719 --> 00:52:28,880 Speaker 1: We're saying, oh, look, it's just like a human. And 855 00:52:28,920 --> 00:52:30,839 Speaker 1: they're walking and they're talking and they're smiling, and then, 856 00:52:30,840 --> 00:52:33,600 Speaker 1: oh god, look at their eyes. Their eyes are dead. 857 00:52:33,680 --> 00:52:35,839 Speaker 1: Look at the eyes. They don't have any 858 00:52:35,920 --> 00:52:39,239 Speaker 1: internal thoughts at all. They're not human.
Yeah, and then 859 00:52:39,320 --> 00:52:42,080 Speaker 1: all of a sudden, Uncanny Valley. Which is a 860 00:52:42,080 --> 00:52:45,360 Speaker 1: little bit about expectation too, I think. Those cross 861 00:52:45,360 --> 00:52:49,799 Speaker 1: over a little, I think. Sure. And so creepiness, I think, 862 00:52:49,480 --> 00:52:55,080 Speaker 1: especially the modern incarnation of creepiness, and these are 863 00:52:55,120 --> 00:52:59,400 Speaker 1: my thoughts, seems to represent a crossroads, 864 00:52:59,480 --> 00:53:07,040 Speaker 1: right, where evolutionarily, creepiness alerts us. 865 00:53:07,040 --> 00:53:09,800 Speaker 1: We're on alert when something's creeping us out. We're really 866 00:53:09,840 --> 00:53:13,000 Speaker 1: focused on that thing right then. Yeah, but we're also 867 00:53:13,600 --> 00:53:17,319 Speaker 1: bound by society not to just turn and run from 868 00:53:17,360 --> 00:53:20,560 Speaker 1: anything that could conceivably be a threat. You can also 869 00:53:20,600 --> 00:53:23,800 Speaker 1: take it a little further and say that, evolutionarily speaking, 870 00:53:24,120 --> 00:53:25,840 Speaker 1: it would not make sense for us to turn and 871 00:53:25,920 --> 00:53:28,480 Speaker 1: run from every single thing that could conceivably be a 872 00:53:28,520 --> 00:53:31,160 Speaker 1: threat before we've identified it as a threat, because we 873 00:53:31,200 --> 00:53:33,680 Speaker 1: would be using up a lot of calories and energy, 874 00:53:33,840 --> 00:53:35,680 Speaker 1: and we would have to find more food than we do. 875 00:53:35,719 --> 00:53:39,080 Speaker 1: It would be inefficient. Right. So we're kind of bound socially 876 00:53:39,120 --> 00:53:42,560 Speaker 1: to stand in place until we identify something as a 877 00:53:42,600 --> 00:53:46,839 Speaker 1: threat or not, in which case, during this period, that's 878 00:53:46,880 --> 00:53:50,160 Speaker 1: when we experience creepiness. Yeah, and I think everyone has 879 00:53:50,200 --> 00:53:53,080 Speaker 1: experienced this. Like, you're in a coffee shop or something 880 00:53:53,719 --> 00:53:58,080 Speaker 1: and, like, some super creepy dude comes in, and if 881 00:53:58,120 --> 00:54:01,600 Speaker 1: you're like me, you're just like, all right, I'm gonna 882 00:54:01,760 --> 00:54:04,399 Speaker 1: keep my eye on that guy. I'm 883 00:54:04,400 --> 00:54:06,680 Speaker 1: not gonna bolt and run, but I might stay near 884 00:54:06,680 --> 00:54:09,400 Speaker 1: the door. Sure, you know, I might get my car 885 00:54:09,480 --> 00:54:13,719 Speaker 1: keys ready. Exactly right. Uh, it's this weird 886 00:54:13,760 --> 00:54:17,840 Speaker 1: social contract. Um, and you know, I feel bad for 887 00:54:17,960 --> 00:54:21,439 Speaker 1: people that just inherently look a little creepy. Well yeah, 888 00:54:21,560 --> 00:54:24,359 Speaker 1: let's talk about that. Yeah. So there 889 00:54:24,360 --> 00:54:27,719 Speaker 1: are these researchers from Knox College who did what they 890 00:54:27,760 --> 00:54:30,640 Speaker 1: billed as the first empirical study of creepiness. And this 891 00:54:30,680 --> 00:54:33,720 Speaker 1: is in twenty sixteen. Such a great study, and um, 892 00:54:33,760 --> 00:54:37,319 Speaker 1: it was an online survey, very little heavy lifting, but 893 00:54:37,360 --> 00:54:40,440 Speaker 1: it was a pretty cool survey.
It was in 894 00:54:40,480 --> 00:54:46,240 Speaker 1: four parts, and um, what they found overall was that, yeah, 895 00:54:46,400 --> 00:54:51,680 Speaker 1: physical characteristics, physical traits that are almost stereotypically linked to 896 00:54:52,000 --> 00:54:57,480 Speaker 1: creepy people do have an effect. They are creepy, um, 897 00:54:57,520 --> 00:55:01,399 Speaker 1: as far as the participants in this are concerned. Yeah. 898 00:55:02,000 --> 00:55:06,160 Speaker 1: So the first section said, hey, you know, what is 899 00:55:06,160 --> 00:55:09,200 Speaker 1: the likelihood that this person is creepy? And there's, like, 900 00:55:09,600 --> 00:55:14,200 Speaker 1: you know, descriptions of them. There's forty-four different behaviors, right. Yeah, 901 00:55:14,840 --> 00:55:18,480 Speaker 1: and the second part was participants rated the creepiness of 902 00:55:18,480 --> 00:55:21,160 Speaker 1: twenty-one different occupations. I'd love to see that list. 903 00:55:22,280 --> 00:55:26,279 Speaker 1: The third section said, list two hobbies that you 904 00:55:26,320 --> 00:55:29,919 Speaker 1: think are creepy. They only needed two. It was open 905 00:55:30,040 --> 00:55:34,360 Speaker 1: ended, too. And then the last section, the participants said 906 00:55:34,600 --> 00:55:37,359 Speaker 1: whether or not they agreed with fifteen statements about the 907 00:55:37,440 --> 00:55:41,680 Speaker 1: nature of creepy people. Yeah, and overall, again, like, they 908 00:55:41,719 --> 00:55:45,320 Speaker 1: found, like, yes, if you have physical traits that people 909 00:55:45,360 --> 00:55:48,720 Speaker 1: find creepy, like bulging eyes, or you lick your lips 910 00:55:48,760 --> 00:55:51,720 Speaker 1: a lot, or, you know, you arch your fingers 911 00:55:51,719 --> 00:55:55,280 Speaker 1: and then just kind of tap them together a lot. Okay, 912 00:55:55,719 --> 00:55:59,920 Speaker 1: it's kind of creepy. But the Knox researchers concluded that 913 00:56:00,040 --> 00:56:03,640 Speaker 1: those aren't creepy necessarily in and of themselves. It's when 914 00:56:03,680 --> 00:56:08,600 Speaker 1: it's in conjunction with other creepy behavior that somebody comes 915 00:56:08,600 --> 00:56:13,360 Speaker 1: across as creepy. Right. And of course, the one behavior 916 00:56:13,400 --> 00:56:15,280 Speaker 1: they put in here that I think was probably universally 917 00:56:15,280 --> 00:56:19,640 Speaker 1: creepy was someone who persistently steers the conversation toward a 918 00:56:19,640 --> 00:56:22,640 Speaker 1: sexual topic. Right, yeah, you don't do that. 919 00:56:23,239 --> 00:56:27,440 Speaker 1: They also found, and this is, like, I think, 920 00:56:27,480 --> 00:56:30,200 Speaker 1: thirteen hundred and forty-one people, 921 00:56:30,200 --> 00:56:33,600 Speaker 1: that ninety-five percent 922 00:56:33,680 --> 00:56:36,160 Speaker 1: of them said that men were more likely to be 923 00:56:36,239 --> 00:56:40,600 Speaker 1: creepy than women. Yeah, I think that's generally true. Um, 924 00:56:41,480 --> 00:56:44,319 Speaker 1: I don't remember getting the creeps a lot in my 925 00:56:44,400 --> 00:56:48,960 Speaker 1: life strictly from the appearance of a woman, right. 926 00:56:49,080 --> 00:56:51,440 Speaker 1: But a lot of dudes on a weekly basis give 927 00:56:51,520 --> 00:56:54,759 Speaker 1: me the creeps. Sure. But we should say:
There's 928 00:56:54,760 --> 00:56:59,000 Speaker 1: a website called girl dot com dot au, and 929 00:56:59,080 --> 00:57:01,160 Speaker 1: they went on Reddit and found a thread 930 00:57:01,160 --> 00:57:03,960 Speaker 1: somewhere that they wrote a blog post about, and now 931 00:57:03,960 --> 00:57:07,280 Speaker 1: we're reporting on it, so it's really come full circle. 932 00:57:08,200 --> 00:57:11,000 Speaker 1: But it was a thread about how women can be creepy, 933 00:57:11,120 --> 00:57:15,359 Speaker 1: and it was written by dudes. And there are some 934 00:57:15,440 --> 00:57:22,080 Speaker 1: things that apparently are universally creepy among boys with women. Right. Yeah, 935 00:57:22,280 --> 00:57:26,120 Speaker 1: women that are too needy can be creepy. Women who 936 00:57:26,200 --> 00:57:30,000 Speaker 1: use baby talk too much, or who quote never leave 937 00:57:30,040 --> 00:57:33,960 Speaker 1: a guy alone. Yeah, I'm just gonna go 938 00:57:34,000 --> 00:57:37,880 Speaker 1: ahead and dump that right into the trash bin. That's 939 00:57:37,920 --> 00:57:42,080 Speaker 1: my only comment on that. Okay, what about eHarmony? 940 00:57:42,240 --> 00:57:45,040 Speaker 1: I mean, if you come home and Glenn Close is in 941 00:57:45,040 --> 00:57:50,439 Speaker 1: your kitchen boiling your pet bunny. Well, that's a threat. Yeah, 942 00:57:50,480 --> 00:57:53,600 Speaker 1: that's not even creepy, that's just a threat. Right. Although 943 00:57:53,600 --> 00:57:58,200 Speaker 1: I will say, in Fatal Attraction, the scene where she 944 00:57:58,400 --> 00:58:01,680 Speaker 1: is sitting there clicking the light on and off, listening 945 00:58:01,680 --> 00:58:04,520 Speaker 1: to Madame Butterfly, that was kind of creepy. 946 00:58:04,880 --> 00:58:08,760 Speaker 1: I was trying to think of, like, a creepy woman. 947 00:58:09,080 --> 00:58:11,760 Speaker 1: I really couldn't come up with anybody. Well, these are 948 00:58:11,800 --> 00:58:15,160 Speaker 1: creepy behaviors, though, you know. Yeah, not like Glenn Close 949 00:58:15,240 --> 00:58:17,360 Speaker 1: walked into the room and you're like, oh, I don't 950 00:58:17,360 --> 00:58:22,520 Speaker 1: know about that. Right, right, right. There's a difference, right. 951 00:58:22,560 --> 00:58:28,160 Speaker 1: There's a difference between genuine creepiness and just doing creepy things. Yeah. 952 00:58:28,200 --> 00:58:29,840 Speaker 1: I think it is much harder for women to be 953 00:58:29,880 --> 00:58:33,960 Speaker 1: creepy than men. I cannot think of a single actual creepy woman. No, 954 00:58:35,960 --> 00:58:40,680 Speaker 1: I'd like to hear from people, though. Yeah. eHarmony. 955 00:58:41,280 --> 00:58:44,520 Speaker 1: So we've talked about Reddit, now we're gonna talk about 956 00:58:44,560 --> 00:58:47,520 Speaker 1: eHarmony. Right. They had an article where they wrote 957 00:58:47,520 --> 00:58:50,400 Speaker 1: advice to dudes. It was called How to Avoid the 958 00:58:50,400 --> 00:58:56,680 Speaker 1: Creep Zone, right, and their advice was for the hobbies 959 00:58:56,680 --> 00:58:59,800 Speaker 1: that you list to be just sort of vanilla, all right? 960 00:59:00,120 --> 00:59:04,160 Speaker 1: Like, even if you are an amateur taxidermist, 961 00:59:04,520 --> 00:59:08,280 Speaker 1: maybe don't put that down, right.
They said it can 962 00:59:08,320 --> 00:59:10,520 Speaker 1: be attractive for a guy to have an off the 963 00:59:10,600 --> 00:59:14,200 Speaker 1: beaten path hobby. And one of the examples they gave 964 00:59:14,240 --> 00:59:17,840 Speaker 1: of an off the beaten path hobby was collecting punk records. 965 00:59:19,120 --> 00:59:22,360 Speaker 1: But don't get weirder than that. Yeah. And, you know, 966 00:59:22,400 --> 00:59:25,080 Speaker 1: taxidermy in and of itself, some people might say 967 00:59:25,120 --> 00:59:27,120 Speaker 1: is super creepy. We did an episode on that. Other 968 00:59:27,160 --> 00:59:30,040 Speaker 1: people might say, no, it's just beautiful artwork, right. 969 00:59:30,280 --> 00:59:33,240 Speaker 1: But Norman Bates was into taxidermy for a reason in 970 00:59:33,280 --> 00:59:38,040 Speaker 1: Psycho, right? It was unsettling. Yeah, you know. Yeah. And 971 00:59:38,080 --> 00:59:44,240 Speaker 1: so the Knox people who carried out this survey, the 972 00:59:44,320 --> 00:59:48,120 Speaker 1: Knox College researchers, they basically said, here's what we think 973 00:59:48,200 --> 00:59:52,160 Speaker 1: it is. Here's creepiness explained. And what they explained was 974 00:59:52,200 --> 00:59:57,120 Speaker 1: what can be called the threat ambiguity theory. Yeah, 975 00:59:57,120 --> 00:59:58,840 Speaker 1: this one, I think they kind of put a 976 00:59:58,920 --> 01:00:01,000 Speaker 1: cherry on top with this one. Yeah, we really did 977 01:00:01,040 --> 01:00:04,080 Speaker 1: like it. It's just basically where you are creeped out by 978 01:00:04,120 --> 01:00:07,919 Speaker 1: something because your hackles are raised right then, and it's 979 01:00:07,960 --> 01:00:11,600 Speaker 1: because you haven't determined whether that thing's a threat or not. Yep. Right. 980 01:00:12,320 --> 01:00:15,800 Speaker 1: There's another one, though, that I subscribe to. I think 981 01:00:15,840 --> 01:00:19,600 Speaker 1: it is finally the unified theory of creepiness. I think 982 01:00:19,600 --> 01:00:25,160 Speaker 1: it covers everything. And it's called the category ambiguity theory. Yeah. 983 01:00:25,200 --> 01:00:28,960 Speaker 1: Now, did David Livingstone Smith make this up 984 01:00:29,080 --> 01:00:32,600 Speaker 1: or did he just champion it? I think he made 985 01:00:32,600 --> 01:00:36,600 Speaker 1: it up, because he wrote about the Knox researchers and said 986 01:00:36,720 --> 01:00:40,120 Speaker 1: what they're talking about you can call threat ambiguity 987 01:00:40,520 --> 01:00:44,480 Speaker 1: theory. With category ambiguity theory, he didn't 988 01:00:44,480 --> 01:00:47,880 Speaker 1: cite anybody else, so it seemed to be his own construct. Yeah, 989 01:00:47,880 --> 01:00:49,760 Speaker 1: so this is the idea. It's sort of like the 990 01:00:49,800 --> 01:00:55,320 Speaker 1: threat ambiguity in that there is some confusion, but it's 991 01:00:55,320 --> 01:00:56,840 Speaker 1: not a threat, like, I think this dude in the 992 01:00:56,840 --> 01:00:59,720 Speaker 1: coffee shop is gonna kill me. It's more like, I 993 01:00:59,760 --> 01:01:03,040 Speaker 1: don't know how to categorize that guy, right, and that 994 01:01:03,080 --> 01:01:08,320 Speaker 1: freaks me out, right.
And it's based on what's called essentialism, right, 995 01:01:08,600 --> 01:01:11,880 Speaker 1: where if you are a member of a species of animal, 996 01:01:11,960 --> 01:01:17,120 Speaker 1: whether human or raccoon or tiger, there's something about you, 997 01:01:17,720 --> 01:01:20,919 Speaker 1: or there's some collection or set of things about you, 998 01:01:21,360 --> 01:01:26,360 Speaker 1: that is totally unique to your species. Yeah, it's something 999 01:01:26,440 --> 01:01:30,600 Speaker 1: you possess because you're a member of that species. And because 1000 01:01:30,640 --> 01:01:33,400 Speaker 1: you're a member of that species, you possess these things, and 1001 01:01:33,440 --> 01:01:35,840 Speaker 1: it can be very difficult to put your finger on it. 1002 01:01:35,960 --> 01:01:38,000 Speaker 1: But it's just one of those things that you know 1003 01:01:38,080 --> 01:01:40,560 Speaker 1: when you see it, or know when you don't see it, right. Yeah, 1004 01:01:40,800 --> 01:01:44,520 Speaker 1: and there are clear borders between these things. You either 1005 01:01:44,640 --> 01:01:47,320 Speaker 1: have this essence fully or you don't have it at all. 1006 01:01:47,360 --> 01:01:50,080 Speaker 1: You're lacking it, you're missing it, and something's really wrong. 1007 01:01:50,640 --> 01:01:53,560 Speaker 1: So in this article he used the example of a 1008 01:01:53,560 --> 01:01:57,960 Speaker 1: wax dummy. Yeah, have you ever been to, like, Madame Tussaud's? Sure. Yeah, 1009 01:01:58,680 --> 01:02:00,520 Speaker 1: I find that the ones, and again with the eyes, 1010 01:02:00,560 --> 01:02:03,120 Speaker 1: the ones that work the best are the ones where 1011 01:02:03,120 --> 01:02:06,880 Speaker 1: they have sunglasses on. Oh yeah, you know, again, Michael Jackson, 1012 01:02:07,400 --> 01:02:10,080 Speaker 1: that's right. But the whole point with these wax dummies 1013 01:02:10,080 --> 01:02:13,640 Speaker 1: with the eyes is they're fixed. They're not moving around. 1014 01:02:13,840 --> 01:02:17,960 Speaker 1: The facial expression is locked in. The skin itself, you know, 1015 01:02:18,480 --> 01:02:21,160 Speaker 1: it can only do so much. And Madame Tussaud's and 1016 01:02:21,280 --> 01:02:23,960 Speaker 1: museums like that are the best of the best, and 1017 01:02:24,000 --> 01:02:26,560 Speaker 1: they do look pretty good. But that's the whole point 1018 01:02:26,560 --> 01:02:29,840 Speaker 1: with the Uncanny Valley: you can't get ninety-nine 1019 01:02:30,360 --> 01:02:33,840 Speaker 1: percent there and say we're fine. It's that one percent that 1020 01:02:34,000 --> 01:02:38,920 Speaker 1: still gives people the creeps. Exactly. And it 1021 01:02:39,000 --> 01:02:42,200 Speaker 1: sums up everything. Like, the threat ambiguity could fall into this, 1022 01:02:42,560 --> 01:02:45,680 Speaker 1: whether you're talking about robots, whether you're talking about a 1023 01:02:45,680 --> 01:02:49,760 Speaker 1: half-dog, half-lizard combo, which Livingstone cites, or 1024 01:02:49,800 --> 01:02:53,800 Speaker 1: Livingstone Smith cites, the dessert. Yeah, a dessert would 1025 01:02:53,800 --> 01:02:57,040 Speaker 1: be creepy when you saw it. Yeah, but so things 1026 01:02:57,080 --> 01:03:00,800 Speaker 1: that are a threat are creepy. But there's also things 1027 01:03:00,800 --> 01:03:03,320 Speaker 1: that are creepy that aren't a threat, and this category 1028 01:03:03,320 --> 01:03:07,080 Speaker 1: ambiguity theory figured it out.
So if that's true, Chuck, 1029 01:03:07,520 --> 01:03:10,720 Speaker 1: and David Livingstone Smith figured out what is the basis 1030 01:03:10,760 --> 01:03:16,600 Speaker 1: of creepiness, we finally have the independent variable licked in 1031 01:03:17,040 --> 01:03:20,600 Speaker 1: Masahiro Mori's Uncanny Valley graph, and we can get 1032 01:03:20,600 --> 01:03:24,600 Speaker 1: to work. Is he still around? Yeah, yes. Wonder if he's 1033 01:03:24,680 --> 01:03:28,840 Speaker 1: happy about all this. I get the impression that he's 1034 01:03:28,920 --> 01:03:31,919 Speaker 1: kind of, like, just, whatever, gone off on his own 1035 01:03:31,960 --> 01:03:35,160 Speaker 1: little thing, okay, and he's fine. He wrote it in 1036 01:03:35,200 --> 01:03:38,000 Speaker 1: nineteen seventy, after all, you know. Yeah, I mean, almost 1037 01:03:38,080 --> 01:03:42,880 Speaker 1: fifty years ago. Yeah, so he's probably up there. Yeah. Uh, 1038 01:03:43,400 --> 01:03:46,000 Speaker 1: you got anything else? I got nothing else. Good. Yeah, 1039 01:03:46,240 --> 01:03:48,200 Speaker 1: if you want to know more about the Uncanny Valley, 1040 01:03:48,280 --> 01:03:50,560 Speaker 1: we should say this was based originally on a Grabster 1041 01:03:50,760 --> 01:03:53,520 Speaker 1: article too. But if you want to know more about 1042 01:03:53,520 --> 01:03:56,200 Speaker 1: the Uncanny Valley, come read that Grabster article. You can 1043 01:03:56,240 --> 01:03:59,480 Speaker 1: type Uncanny Valley in the search bar at HowStuffWorks dot com. 1044 01:03:59,520 --> 01:04:04,840 Speaker 1: And since I said search bar, time for listener mail. Well, 1045 01:04:04,880 --> 01:04:07,440 Speaker 1: and today it's a very special listener mail. This is 1046 01:04:07,920 --> 01:04:10,640 Speaker 1: a Josh edition, because you picked out a very special one. 1047 01:04:10,920 --> 01:04:13,120 Speaker 1: I love this one. I'm going to butcher the dude's name, 1048 01:04:13,160 --> 01:04:15,480 Speaker 1: but that's right, take it away. It's a good one. Okay. 1049 01:04:16,200 --> 01:04:19,760 Speaker 1: I'm going to call this one Email from a Real 1050 01:04:19,880 --> 01:04:24,520 Speaker 1: Irish Historian, and it feels pretty good. Chuck, am I 1051 01:04:24,520 --> 01:04:28,000 Speaker 1: out of a job? Yeah, maybe. Okay. Hi guys, I'm 1052 01:04:28,000 --> 01:04:30,440 Speaker 1: a big fan of the show. It's informative and insightful, 1053 01:04:30,480 --> 01:04:32,560 Speaker 1: and I find myself interested in things that I never 1054 01:04:32,640 --> 01:04:36,360 Speaker 1: looked twice at before. One subject that I'd always found 1055 01:04:36,360 --> 01:04:40,200 Speaker 1: fascinating was the correlation between the Native American Choctaw tribe 1056 01:04:40,400 --> 01:04:42,560 Speaker 1: and the people of Ireland. I didn't realize that was 1057 01:04:42,600 --> 01:04:44,960 Speaker 1: a thing. Did you? Not at all. This is a 1058 01:04:45,000 --> 01:04:47,480 Speaker 1: story which isn't well known, okay, which isn't well known 1059 01:04:47,480 --> 01:04:50,160 Speaker 1: outside of some areas of Ireland and of course within 1060 01:04:50,200 --> 01:04:52,480 Speaker 1: the tribe. But it's a really good story of solidarity 1061 01:04:52,520 --> 01:04:55,560 Speaker 1: between two groups of people, despite being thousands of miles apart.
1062 01:04:56,040 --> 01:04:58,280 Speaker 1: Less than twenty years after the Trail of Tears, which 1063 01:04:58,320 --> 01:05:02,440 Speaker 1: forcibly displaced thousands of natives, the Great Famine hit Ireland. 1064 01:05:02,760 --> 01:05:05,040 Speaker 1: During this time, as you know, Ireland was colonized by 1065 01:05:05,080 --> 01:05:07,720 Speaker 1: the British and the people of Ireland were treated poorly 1066 01:05:07,920 --> 01:05:11,160 Speaker 1: due to the common misconception that Irish Catholics were a lower 1067 01:05:11,200 --> 01:05:14,680 Speaker 1: caliber of human. He goes on to give more examples, 1068 01:05:14,680 --> 01:05:17,439 Speaker 1: but just suffice to say it was not good for 1069 01:05:17,560 --> 01:05:21,200 Speaker 1: the Irish people. During the famine, word spread to America 1070 01:05:21,320 --> 01:05:24,560 Speaker 1: and to the Choctaw tribe. They sympathized with the Irish 1071 01:05:24,560 --> 01:05:27,560 Speaker 1: people so much that, only fifteen years after the Trail 1072 01:05:27,600 --> 01:05:31,200 Speaker 1: of Tears, they donated seven hundred and ten dollars during 1073 01:05:31,240 --> 01:05:34,480 Speaker 1: eighteen forty-five to send to Ireland as part of 1074 01:05:34,480 --> 01:05:37,720 Speaker 1: a relief fund. This is estimated to be roughly sixty 1075 01:05:37,760 --> 01:05:41,280 Speaker 1: eight thousand dollars in today's money. This was greatly appreciated 1076 01:05:41,320 --> 01:05:44,040 Speaker 1: by the Irish people, and after the famine, the bond continued. 1077 01:05:44,440 --> 01:05:47,200 Speaker 1: In Cork, we have a sculpture honoring the tribute of 1078 01:05:47,240 --> 01:05:50,280 Speaker 1: the Choctaw people, and in nineteen ninety, members of the 1079 01:05:50,320 --> 01:05:52,800 Speaker 1: tribe came to Ireland and walked the Famine Walk in 1080 01:05:52,880 --> 01:05:56,000 Speaker 1: Mayo to replicate the walk that starving people made to 1081 01:05:56,000 --> 01:05:59,080 Speaker 1: ask the landlord for help. In nineteen ninety-two, an 1082 01:05:59,120 --> 01:06:03,320 Speaker 1: Irish commemoration group walked from Oklahoma to Mississippi to replicate 1083 01:06:03,360 --> 01:06:06,960 Speaker 1: the Trail of Tears and raised seven hundred thousand dollars 1084 01:06:07,280 --> 01:06:10,560 Speaker 1: to help fight poverty in Africa. These two groups continue to 1085 01:06:10,600 --> 01:06:13,000 Speaker 1: work together, and to this day our President has been declared 1086 01:06:13,000 --> 01:06:16,040 Speaker 1: an honorary member of the Choctaw tribe. Along with the 1087 01:06:16,120 --> 01:06:18,520 Speaker 1: Quakers, who fed Irish people to the point that their 1088 01:06:18,560 --> 01:06:22,320 Speaker 1: members ended up starving themselves, the Choctaw tribe remain some of 1089 01:06:22,360 --> 01:06:25,640 Speaker 1: the unsung heroes of the famine story of Ireland. Sorry 1090 01:06:25,640 --> 01:06:27,680 Speaker 1: it went on so long. I'm an Irish historian, so 1091 01:06:27,720 --> 01:06:29,920 Speaker 1: I tend to waffle. Love the show. Best of luck 1092 01:06:29,960 --> 01:06:37,240 Speaker 1: with yourselves, Roisin Kilroy Bean. Fantastic, great story. Thanks 1093 01:06:37,240 --> 01:06:39,400 Speaker 1: a lot, Roisin.
I'm quite sure that it's not the 1094 01:06:39,440 --> 01:06:41,720 Speaker 1: actual pronunciation of your name, because there's a lot of 1095 01:06:41,760 --> 01:06:45,880 Speaker 1: accent marks over letters where there normally aren't, yeah. So I 1096 01:06:45,880 --> 01:06:48,880 Speaker 1: apologize for that, but I nailed your last name. I'm 1097 01:06:48,880 --> 01:06:52,200 Speaker 1: positive of it. And Josh Clark, three and a half stars, 1098 01:06:52,200 --> 01:06:54,560 Speaker 1: not bad. Out of three and a half, right? I 1099 01:06:54,600 --> 01:06:57,439 Speaker 1: don't remember. What was Star Search, the four stars? Oh, 1100 01:06:57,480 --> 01:07:00,560 Speaker 1: I don't remember. I just now remembered there was 1101 01:07:00,600 --> 01:07:04,520 Speaker 1: such a thing as Star Search. Yeah, well, okay, well, you take 1102 01:07:04,520 --> 01:07:07,520 Speaker 1: this end part, Chuck, since I took listener mail. Oh geez, 1103 01:07:08,800 --> 01:07:12,120 Speaker 1: thanks for listening. Um. Hey, if you want to get 1104 01:07:12,160 --> 01:07:14,240 Speaker 1: in touch with us, you can go to our official pages, 1105 01:07:14,840 --> 01:07:18,440 Speaker 1: Stuff You Should Know podcast. What else? Let's see. If 1106 01:07:18,440 --> 01:07:20,800 Speaker 1: they want to send us an email? Oh yeah, email 1107 01:07:20,880 --> 01:07:24,920 Speaker 1: us at stuff podcast at HowStuffWorks dot com. And have 1108 01:07:25,000 --> 01:07:27,560 Speaker 1: a good day. Is that what you say? That's good enough. 1109 01:07:27,960 --> 01:07:33,640 Speaker 1: All right. Stuff You Should Know is a production of iHeartRadio. 1110 01:07:34,120 --> 01:07:37,280 Speaker 1: For more podcasts from iHeartRadio, visit the iHeartRadio app, 1111 01:07:37,480 --> 01:07:40,400 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.