Speaker 1: Welcome to Stuff You Should Know from HowStuffWorks.com.

Hey, and welcome to the podcast. I'm Josh Clark, and there's Charles W. "Chuck" Bryant. There's Jerry over there doing the robot, which means this is Stuff You Should Know. Robot... stop. I knew I'd get a laugh out of ya sooner or later. Did you do the robot? Can you do the robot? I think I've seen you do a pretty bad robot. I don't know about a pretty bad robot. I can do a pretty great robot, that's what you've seen. I can't do any of that stuff. Yeah, I can't really either. If your claim to fame is, like, a really great robot dance, I don't know, maybe take up some other hobbies as well, kind of round that out. You don't want that to be the only thing you're good at, right? Because if you list that on the dating site, you might turn ladies off. Yeah, according to eHarmony. Yeah, that's foreshadowing. I love that one, don't you? Yeah, there's some issues with that whole... oh yeah, yeah, we'll get to that.

All right. Well, let's start at the beginning. Almost the beginning, Chuck. Let's go back to nineteen seventy, which was the beginning of the greatest decade in the history of humanity. Yeah, neither one of us were born yet. I can finally even say that. I'm still not even born. It must feel good. Yeah. Okay, well, welcome to the club, I guess. Thanks. And in nineteen seventy, we're not just going anywhere in nineteen seventy, we're going to Japan in nineteen seventy. Japan was pretty cool in the seventies. Yeah, a lot of bell bottoms, a lot of ninjas running around still. Um, there were calculators being wielded all over the place, probably. It was a good time, a good time for Japan, right. And one of the things that was going on in nineteen seventy...
I could not, for the life of me, find what issue of this journal it came out in, or what month. But at some point in nineteen seventy there was an obscure journal, a Japanese academic journal called Energy, and at some point during that year it published an article by a Japanese roboticist, and his name is Masahiro Mori. Thank you, I have a lot of practice. And Masahiro Mori published this article and he named it Bukimi no Tani Genshō, which is actually the full name of the whole thing. And as we'll see, it's kind of difficult to translate into English, right. And it took many, many years after he wrote this article for it to be translated into English, for anybody to even attempt it.

So basically, Mori was this roboticist, and he wrote this essay, and at the time he just put it out there and went back to work, started teaching more and more roboticists. A whole new generation of roboticists learned under him, and his work, that article I should say, just kind of sat there unobserved. And then in two thousand five a rough translation of it leaked out, it wasn't intended for publication, and the world entirely changed, right. Because Masahiro Mori had, in his article, put his finger on something that no one had before, in his capacity as a roboticist and a human, and that was what we call today the Uncanny Valley.

Yeah. So that's, um, the idea that you're making a robot, and we'll see this applies to more than just robots, but in his case, you're making a robot and you want to make it look like a person. Um, which I guess not all roboticists do; some of them like the clunky Jetsons-style robots, like Rosie. But I guess if you're Mori, you're on the path to designing lifelike robots. And the closer you get to that lifelike look, everything's going great.
Everything's going great, people are like, this is so cool, this is so cool, and then all of a sudden people go, ooh. Like, right as it approaches its most... or basically, when it reaches the most lifelike capacity that whoever's making it can conjure, people are repulsed by it. Yeah, which is something that most people who ever hear of the Uncanny Valley are like, yeah, you know, I've noticed that, that's happened to me before too. But the thing is, Chuck, it doesn't actually make sense, right? Like, we know a robot is a robot. Yeah. So, you know, maybe you could be afraid that it's gonna, like, pick you up and break you in two or something, like a cartoon, but that's different than being creeped out. Like, why would we be creeped out by a robot? And this is what Mori put his finger on: there's something to this, and it doesn't make sense. And it wasn't even just, um, this article that he wrote. He created a graph as well, one that's become quite famous, that really kind of gets the point across more than anything else.

Yeah. And he wasn't even the first person to go over this and put some thought to it. Freud, of course, because he liked to think about everything, thought about it a little bit. And before Freud, there was a German named Ernst Jentsch. Oh, nice. I did not realize that's how his last name should be pronounced. That's good stuff. I think I put a T on the end, but the T's in the middle: Jentsch. Yeah, I think that's right. I've been saying "Gentch." We'll have to look that up, then. I think, of the two of us, you've got the German down. Uh, and he had a little term, um, unheimlich, that he called it. So, like, you know, different languages had different names for it.
Um, and you go back in time, all the way back to, like, the seventeenth century, and people were... and I guess, you know, robots didn't look super lifelike back then, but whatever their version of lifelike was, in the sixteen hundreds, people were like, I don't like that. Why is it looking at me? Yeah, it's got a quill and it's writing things.

But like you said, Mori made this graph because he was a roboticist, and he thought, you know, let's look at this plotted out so we can stare at it. And on the x axis he had human likeness, and on the y axis he had affinity, like whether or not you like the way this thing looks. And just as we're talking about, the graph went up and up as things got more lifelike and people liked the way they looked. And then, at a certain point, there's that valley, there's a big dip, that really just kind of says it all, right.

And again, this all makes sense intuitively, but as we'll see, it's been very difficult to prove. And one of the reasons why it's confounded research thus far is because we're not even sure what Mori meant by some of the words he chose, at least as far as translating them to English, right. Um, for example, bukimi, right? It was translated in two thousand five as "uncanny," but again, that original translation was not intended for publication, but it leaked out, and so "Uncanny Valley" became, you know, the way we all think of it here in the West. But bukimi more closely resembles something like "eerie." Like, I've seen it explained that a word like bukimi means more than uncanny. Uncanny is just weird or remarkable or noteworthy; it's not necessarily something that gives you the creeps. Bukimi is something that gives you the creeps. Like Steve Bukimi's. Exactly. Um, so bukimi probably should have been translated as the Eerie Valley.
But by the time an actual official translation that, um, Mori signed off on came out in two thousand twelve, the cat was out of the bag. Everybody knew it as the Uncanny Valley, and there's no way anybody was gonna come back and be like, no, no, no, everybody stop calling it that, it's now the Eerie Valley, okay? Right. All right. And it may be one of those things where we're so used to Uncanny Valley now that it's hard to imagine Eerie Valley. But right, I think that was the issue. Yeah, like, nobody's gonna go along with that.

So this graph, like I said, it starts off on that left-hand side, and this is where you have things that are super robotic. Um, like, you know, a packaging robot in a factory, um, that, you know, apparently most people don't have fondness for. I do, because I love mechanical processes. Right, right. Okay, so there's part of the problem. It's like, that's not necessarily the kind of feeling that Masahiro Mori was talking about. He was like, yeah, yeah, you're interested in robotics and robotic arms and industrial processes, and you love watching How It's Made, right. What he was talking about was more like how it resembles a human, and then how it makes you feel in relation to its resemblance of a human, right. Well, in that case, it makes me feel nothing, because it doesn't look at all like a human. Right, okay. So that would be at about the origin of the graph. It has no resemblance to a human, really, and it's not eliciting any real affinity in you at all, as far as it looking like a human, right. But lots of affinity as a thing, that's just... that's called props.

So you go a little bit further on the graph, and then you have things like, um, little stuffed animals, and... uh, no, C-3PO is a common one that's mentioned, because C-3PO, um, you know, is built to look like a human. He does a great robot. It talks like a human and acts like a human.
But when it comes to that face, and as we'll see, the face is kind of the key to all this, um, for the most part, C-3PO looks nothing like a human in the face. So everything is still good, and people love C-3PO, right. So if you're looking at the graph, C-3PO is going up in human likeness, because he's got some commonality there, and we're feeling affinity for him based on that human likeness. So he's going up. Okay. Everything's going pretty well so far, right, Chuck? That's right. Okay.

So then we're gonna start hitting some areas where things start looking a little more human, a lot more human, I would say, than C-3PO. Like, say, the characters in Moana or Frozen, uh, Pixar characters, that kind of thing, where they look like they're supposed to be human, like they're based on humans, but they have very exaggerated features that you would never confuse at first glance for an actual human, right. So they have, like, big eyes, small noses, things that make them cute, right. And so our affinity for them is going up as the human likeness is going up. Again, things are going really well so far. That's right, because in Moana and Frozen they look a little bit more like people, and we like them a lot more for that reason.

And then, like you said earlier, out of nowhere, this line that's just been going up very pleasantly in a nice little slope just drops downward, right. And it doesn't just drop downward, it goes actually below the x axis, into negative territory. And now this is the Uncanny Valley, that's right. And that's why it has that name, because it's a valley, right. And this is where those things like really, really lifelike androids live, or, um, corpses live, or zombies live. Because Mori, he had the idea that if something's moving, it's even creepier than something similar to it that's not moving.
So he actually created two lines on this graph, one for things that are animate and one for things that are inanimate. So if you look at this uncanny valley on the inanimate line, the non-moving line, you've got corpses at the bottom of it. But if you look at the animate line, it dips even further below than the inanimate line, and at the bottom of that are zombies. So dead people up and moving around and saying "brains" is as creepy as it gets, as far as this graph is concerned.

Yeah, and Mori wasn't the only one. That Ernst Jentsch that we talked about, the German psychiatrist, uh, he also talked about the fact that if you are looking at something that should not be moving and it moves... um, I mean, I think we can all agree that a baby doll that suddenly turns its head and looks at you is probably one of the creepier things you can witness, right? You know? Yeah, it's about as creepy as it gets. Or, um, have you ever been to an open casket funeral? A few. I'm not a fan at all. No. It makes sense. You know, we've really kind of closed off, or put a lot of space in between us and death, way more than we used to have in, like, the nineteenth century, sit up with the dead, sure, right. So this seems to be kind of a holdover from that. But if you've ever been to an open casket funeral and have just stared at the corpse long enough, like maybe its arm or its fingers or something, your brain is so anticipating that it's about to start moving that sometimes you can creep yourself out and make yourself think you did actually see it move. You'll also be asked to leave the funeral. Well, you shouldn't be giving commentary about this out loud, but you can, you know, you can do it to pass the time at the funeral, if you're looking to kill some time.
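[A minimal illustrative sketch, for readers who want to see the curve the hosts are describing. Mori drew his graph freehand, with no underlying data, so the functional form and every number below are invented for illustration; only the qualitative shape comes from the discussion above: affinity climbing with human likeness, a plunge below zero near "almost human," and a deeper plunge for things that move.]

```python
# Illustrative only: a linear rise in affinity plus an invented Gaussian
# dip near "almost human" -- not Mori's data (he had none).
import numpy as np
import matplotlib.pyplot as plt

likeness = np.linspace(0.0, 1.0, 500)  # 0 = industrial arm, 1 = healthy person

def affinity(x, dip_depth):
    """Affinity rises with human likeness, then dips sharply near 'almost human'."""
    rise = x                                             # steady climb (C-3PO, stuffed animals)
    dip = dip_depth * np.exp(-((x - 0.8) ** 2) / 0.003)  # the valley, centered ~80% lifelike
    return rise - dip

# Mori's two lines: moving things dip deeper (zombie) than still ones (corpse).
plt.plot(likeness, affinity(likeness, 1.3), label="still (corpse at the bottom)")
plt.plot(likeness, affinity(likeness, 1.9), label="moving (zombie at the bottom)")
plt.axhline(0, color="gray", linewidth=0.5)  # below this line, affinity turns to repulsion
plt.xlabel("human likeness")
plt.ylabel("affinity")
plt.legend()
plt.title("Uncanny valley: illustrative shape, not real data")
plt.show()
```

[The Gaussian dip is just a convenient stand-in for "a narrow, sharp valley"; as the hosts note next, no one has actually pinned down the real shape of the curve, or whether there is a single shape at all.]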
Uh, so here's the thing with all this. Um, we know this happens, because everyone kind of has this feeling, but, and all this research has been done, no one is exactly sure why this happens. So, uh, with Mori's essay, and especially once it was translated, um, a lot of research started happening in this area. And, uh, it's problematic, though, because there are a few different problems. One is, well, it's subjective. This dependent variable, whether you have an affinity for something, is very subjective, so it's hard to kind of nail that down scientifically, right. All right. So then number two is human likeness, right? This is the independent variable. And if you have human likeness, like, what does that mean? Like, what looks human? What doesn't look human? We haven't pinned that down. So, like, if you can't pin the dependent variable down or the independent variable down, it makes it really tough to study. Correct.

And then there's a third one, too. I love this one. Yeah. The third one is, uh, you know, the original hypothesis doesn't have a mathematical model that, like, really specifies the shape of this curve, so it's still hypothetical, I guess, right. Which means, so if you look at Mori's graph, he just basically made a line, right? It wasn't based on any studies he'd done. The whole thing was really an essay more than anything else. Um, so researchers who are trying to seriously study this scientifically have nothing that they're actually trying to place their findings against, which puts them at risk for what's called the Texas sharpshooter fallacy. Greatest-named fallacy around. And, um, it's based on the idea that if you take a sharpshooter in Texas and have them shoot at the side of a barn a bunch of times, some of the shots are inevitably going to hit the barn, and then the Texas sharpshooter walks up and draws the bull's-eye around the bullets that he already sunk into the side of the barn.
That's the Texas sharpshooter fallacy. It's ignoring data, like the shots where he missed the barn, in favor of the data that falls into what you're looking for, the bullet holes in the barn. You could do the same thing with the, um, data that you get from testing the Uncanny Valley, if you have no model to fit it into already. Yeah, I think they would have done better if, instead of trying to prove something, uh, they'd maybe just researched it and called it a thought experiment, you know, right. But people are taking it seriously, and we'll talk about some of this research right after this, Chuck.

All right, so we're back. And despite the fact that this is really tough to study, and it's not even established that it's a real thing in everyone's mind, by the way, um, there are people out there who are really studying the Uncanny Valley and trying to pin it down. Yeah. One of these people is a Dartmouth College psychologist. And I didn't look up their mascot. The... the Pub Darts? The Fighting Pub Darts of Dartmouth College? We're gonna hear from Dartmouth. But her name is Thalia Wheatley. Um, and she's done some research and has found that it's not just, like, some uniquely Western thing or American thing. It's kind of all over the world. She studied tribes in Cambodia, and they have the same sensitivities to these things that look human but aren't human. Uh, and they've even found, and I think it kind of all comes down to the eyes, but they found just looking at the eye can be enough. Yeah. Um, somebody can tell whether it's a human or not just looking at a picture of the eye, right. Yeah. And that's where I think people lose credibility. And we'll talk about movies and sculpture and all that stuff, but they just never get... you can't get the eyes right. Like, you can't put life in lifeless eyes, hard as they try. Only God can.
Uh, and there was this other experiment where they, um... you know, like where you can morph a face digitally or whatever, like that Michael Jackson "Black or White" video? Yeah. I think some people are creeped out by that, even. But they would show this doll image, and it would morph into a human face, and basically they would have people mark where they thought it started to look more human than doll. And it, you know, it landed about at the mark, as far as morphing into human, which, I mean, you can't really apply that necessarily, but it's just interesting. Offhand, the point is about where the Uncanny Valley happened in Mori's mind. Yeah, I would think it would be higher than that, but, um, yeah, it's still super interesting.

Um, and you were saying the eyes, and that's what you're betting on, that it's going to turn out to be the eyes, right. So, trying to investigate what constitutes human likeness, there's a researcher named Angela Tinwell, and she basically says, like, yes, it's all about the upper facial features, and that we detect those, we read those, and so if there's anything that's even just slightly off in, like, you know, the eyes, or the brows, or the wrinkles that form, um, that will lead to the uncanny valley. That's the creeping part. Or the smile, too, she also says. Well, yeah, and all these things kind of come down to evolution, and her point is, like, you can't battle millions of years of evolution that has honed our dumb little human brain to detect something that's off about a face. Uh, it's just too much to overcome, basically.

This other researcher, named Karl F. MacDorman, who's from Indiana University, he's basically, like, dedicated his career to this now. Um, he found that certain kinds of people, if you do, like, a personality inventory before testing for Uncanny Valley sensitivity, some types of people are predictably more sensitive to the Uncanny Valley than others.
Specifically, he found that very religious people... that makes total sense. Yeah. Neurotic people, um, and, uh, people with high sensitivity to animal reminders. That's basically anything that reminds you that, hey, you're super civilized and you drive a car and you know how to play poker, um, but you're still an animal, just as much as that ape over there is. That's an animal reminder. Um, people who are sensitive to that kind of thing tend to get set off by the Uncanny Valley as well. And then people who are anxious are more likely to experience the Uncanny Valley, as far as MacDorman is concerned. Yeah, that's interesting. It makes sense too, because they're probably just more prone to, I don't know, just having a reaction to a lot of things like this, right.

But we should say, the science in all this, the fact that the independent and the dependent variable are still not defined... this is like the scientific equivalent of that backward, over-the-head, half-court basketball shot. Yeah. That's the level of science that these people are carrying out at this point, because a lot of them, sadly, are conducting experiments based on something that, again, doesn't have a set dependent or independent variable. So how can you do that, is my question. Well, yeah, I mean, because in each experiment they're going to be using different, uh, stimuli, um, different faces, whether it's a doll or a wax figure or a CGI character, and then those are gonna be doing different things and have different expressions, and each person has their own subjective take, so it is a very tough thing to kind of nail down. Yeah. And I think some of them are actually trying to form the basis of this field of study right now, they're doing the groundwork. But I think some of them also are, like, just chasing headlines. Like, there's no better way to get into the media cycle with your study than releasing some findings on the Uncanny Valley. People just love it.
Uh, one thing I thought was interesting was, um, at Princeton they tried this with monkeys, and they found the same thing happened when they had these realistic-looking but fake monkey faces. The monkeys were, like... they turned away. Um, it did make me think, though, like, you've seen these situations where, like, an orphaned animal has a creepy puppet mother? Yeah, I know exactly what you're talking about. And they seem to like that. But, however, and this is a bit of a spoiler, but, um, towards the end of this article it points out that human babies don't have this reaction at first either, and that it's kind of learned. So maybe that explains it, maybe, with the animals. I know the one you're talking about, that cage, like, wire monkey mother? It's super creepy. Is it a black and white photo? Well, no, I mean, they do it with all kinds of animals. They'll have, like, a fake tiger or a fake duck or whatever, just so the animal will feed. I mean, it's usually an animal that milks from the mother, I guess. But, um, it's a common thing they do for orphaned milk-feeding or breastfed animals. And they're always creepy, huh? Well, I mean, to us, but to a dumb baby monkey, they're just like, sweet, give me the teat. There's a T-shirt, maybe even a band name: Sweet, Give Me the Teat. Yeah, yeah, that kind of falls into the long-band-name category.

But here's the thing: not everyone agrees with this whole thing. Like you said earlier, there's a man named David Hanson, and he's a roboticist as well, in Plano, Texas. And, uh, he did a very, very basic study. It was a survey where they showed images of two different robots that were animated to simulate human facial expressions, and basically just asked, hey, what do you think of this? And people said, I like them. Yeah. Can you see why people have trouble with this study, though? Yeah. He said not one person said they were disturbed.
Okay, sounds good. For the most part, though, studies into the uncanny valley are like, now, we're finding something here. Although we should be suspicious of ones that basically show the exact uncanny valley that Mori just graphed out, like, freehand. Like, if you come across a study that shows that same thing, they're probably cherry-picking data, we've got to say. Out of his butt, maybe.

There was another study, Edward Schneider at SUNY Potsdam in New York. I bet they don't even have a mascot. They got together characters from cartoons and video games, everyone from Mickey Mouse to Lara Croft, who is very attractive, by the way. She's a computer character. Yeah. Well, no, I'm talking about playing Tomb Raider. Oh, I never played. Yeah, when it first came out, you know, I played Tomb Raider and I was like, oh, look at her, she's kind of hot. Well, she gets a lot of stuff done. That's very attractive. That's true. I mean, she travels a lot, she's an independent person. Yeah, that's what I meant. I was attracted to her mind and her adventures. Uh, so anyway, they asked people in this study, um, how attractive do you think these characters are, or how repulsive do you think they are? And again, there was, um, a graph with a dip in it at a certain point, as you would expect. Yep. Careful, careful, everybody.

So if you're a robot designer, right, one of the things, like even back in his essay written in nineteen seventy, Masahiro Mori said, um, there's problems here with movement, there's problems here with the smile. It has something to do with the face, right. Um, and somebody else said, I don't remember who it was, but there always seems to be a lag time between how realistic a designer can make a robot and how realistic an engineer can make that robot look, right. And that disconnect, in
Mori's mind, was a big part of the Uncanny Valley. But he also seemed to focus on the smile and the eyes. And one of the things that's at stake, like, besides this just being an interesting topic of discussion, is that there are actual real-world implications for this whole thing, right. Like, if you're a robot designer, you want to create something that's not going to freak people out, because the whole purpose of robots is to interact with humans, and you want them to interact with humans. Lifelike-looking robots, I should say. Right, because, like, Ford Motor Company is never gonna buy an android that looks human just to work on an assembly line, when they can get the same thing that does the same job cheaper and it just looks like a robotic arm or something, right. The whole purpose of a lifelike-looking robot is that the robot is being designed to interact with humans. And if you are going to run into this spot, and some people say it's not even a valley, some people think it's an insurmountable cliff or a wall, so if you're gonna run up against this, you want to figure out how to overcome it, because you don't want to creep people out with your creations. Well, and you don't want to spend a lot of money, um, to develop a robotic Walmart greeter at every store. Because it's happening, like, this is coming, people.

Yeah. There's a robot called Geminoid F, or Actroid F, depending on who you ask. I've also seen her called Ellie. And she is out of this lab run by a guy named Hiroshi Ishiguro, and he is probably the world's leading roboticist. If you've seen any lifelike android, it probably came out of this guy's lab, right. And she is starting to get out there in the world. She's been a debriefer of soldiers coming back from war with PTSD, based on the idea that they might share more with a robot that they knew was just a robot than they would with an actual human. Um, she's in a play.
She stars as an android, right. And then there's KASPAR. There's a little robot called KASPAR. Yeah, KASPAR is a robot boy with a great cause, created to help children with autism learn to read facial emotions. If you look up photos of both of these, Geminoid F, uh, looks great, and really, like, Ishiguro is doing great, great work. KASPAR looks terrifying, right. And so KASPAR is creepy, but that's not his purpose at all, right? His purpose is to, like, teach kids with autism how to connect. But if he's repelling them through this uncanny valley, he's defeating the purpose. Well, they should go to Ishiguro and say, hey, we have this great cause, can you make us something that doesn't look like the stuff of nightmares? Right, exactly. I wonder if KASPAR has been, um, effective. You know, I don't know. I don't know. Now I feel bad I didn't look into that. Well, I just... I don't know, he's very creepy looking. I agree wholeheartedly. It's kind of like, no, he's not finished, get back to the drawing board. Either that, or, and this is what Mori said, go the other way. Like, just make him not human at all, just cute or approachable.

So the roboticists are not the only ones who are facing this, Chuck. There is a pretty powerful, moneyed contingent of interested stakeholders in overcoming the Uncanny Valley, or at least figuring out if it's totally insurmountable, and that is Hollywood. Yeah. Um, Hollywood has a sort of rich history of getting it wrong when it comes to creepy CGI characters. Um, Pixar, their very first short film, actually, is called Tin Toy, uh, a little five-minute short, and this, you know, this preceded Toy Story and everything. Yeah, it was actually kind of like the outline of Toy Story's plot. Yeah. But they showed it to test audiences, and they made the mistake of making the baby, Billy, look too realistic, and everyone loved Tin Toy and everyone hated Billy. Yeah. Have you seen it?
Yeah, yeah, yeah. He's pretty hateable, for sure, and he is the antagonist, but he struck some chord with viewers that Pixar did not mean to strike. And they actually, I mean, this is extraordinarily fortunate for Pixar, this was very early on in their history, and, um, they learned from it. Actually, they were like, okay, note to self: don't try to make any of these characters lifelike. Let's go a different direction. And so they came up with those, um, exaggerated features that we've all just come to know and love. Yeah, which was a great, great direction to go in, obviously, because they've had tons of success with that model. Right. You can make the case that it may have saved the company, because other companies and other movies, for sure, have not been nearly as fortunate.

Yeah. One of the first big photoreal computer-animated movies was Final Fantasy, colon, The Spirits Within. You should never have a colon in your movie title, by the way. That was the first mistake. But this one was from two thousand one, and based on the video game, and it was off-putting to a lot of people, and it was a big, big bomb for Columbia Pictures. But this was before the Uncanny Valley had really been established, before Mori's essay was translated, so reviewers didn't quite know what to say. Now they would just say, we've tumbled into the Uncanny Valley again. But then they would say things like... Peter Travers, great reviewer from Rolling Stone, said, "At first, it's fun to watch the characters," ellipsis, ellipsis. But what's an ellipsis? Is that two of them? A couple of them? "But then you notice a coldness in the eyes, a mechanical quality in the movements. Familiar voices emerging from the mouths of replicants erect a distance." Yeah. So he's describing the Uncanny Valley, he just didn't have the name for it yet.

Um, and then a couple of years later came The Polar Express, which became, I think, even more famous than Final Fantasy as far as the Uncanny Valley goes.
But again, it's like you said, you know, the reviewers didn't quite know how to put their finger on it. And I'm not quite sure how Final Fantasy was done, but I know that The Polar Express used similar, um, software and hardware to what roboticists are using now, where it's like motion capture, but rather than translating the motion to a robot, it's translating the motion into, like, a digital 3D rendering of the character, right. So The Polar Express was really, really expressive, but not quite there, so it fell really hard into the Uncanny Valley. And, um, I think David Germain of the Associated Press, uh, compared the kids in this heartwarming family Christmas movie to the children from Village of the Damned, which is not what you want. It's not at all what the studio wanted, and I think it lost a pretty decent amount of money.

Yeah, there was another one, uh... and these are all, by the way, courtesy of Robert Zemeckis. He really went all in on this technology. I don't know why. I think he just... I think sometimes, as an artist, you can get so wrapped up in the coolness of, wow, look what we can do now, that you don't step back and look at what you're doing, like, should we be doing this? Because he also had a part in the Beowulf movie in two thousand seven, and that was a huge bomb. Um, The New York Times said this about that one: "People who are meant to be enraged, who are at risk of plummeting to their deaths, just look a little out of sorts. When it was over, I felt relieved to be back in the company of uncreepy flesh-and-blood humans again." Sad. And then, uh, there's The Adventures of Tintin. Yeah. I really liked Tintin, though. I did too. I think Spielberg, I mean, there is that Uncanny Valley a little bit, but the story and the movie were so good he overcame that, I think. I was about to say, I think Spielberg has come the closest to overcoming that chasm of anybody.
591 00:36:23,719 --> 00:36:25,919 Speaker 1: But did he do it through good storytelling or through 592 00:36:26,120 --> 00:36:28,759 Speaker 1: the eyes? I don't know. I don't know if 593 00:36:28,840 --> 00:36:30,680 Speaker 1: it was a combination of 594 00:36:30,760 --> 00:36:35,000 Speaker 1: the two. I don't know, but it's extraordinary. 595 00:36:36,520 --> 00:36:39,680 Speaker 1: So, you know that 596 00:36:39,840 --> 00:36:41,719 Speaker 1: stuff you'll see every once in a while, where somebody does, 597 00:36:41,800 --> 00:36:44,200 Speaker 1: like, what Beavis and Butt-Head would actually look like 598 00:36:44,320 --> 00:36:48,560 Speaker 1: in real life, or what Charlie Brown would look like in real life. Right. 599 00:36:49,160 --> 00:36:53,200 Speaker 1: So it still has kind of got a cartoonish quality 600 00:36:53,280 --> 00:36:55,760 Speaker 1: to it. It's the same thing with the Tintin movie. 601 00:36:56,320 --> 00:36:59,000 Speaker 1: But it was like, it was as if you were 602 00:36:59,120 --> 00:37:03,480 Speaker 1: living in a dimension where humans looked somewhat cartoonish. 603 00:37:04,840 --> 00:37:06,680 Speaker 1: Is that making any sense, or does that just make 604 00:37:06,760 --> 00:37:09,040 Speaker 1: the whole thing even harder to understand? No, I get that. 605 00:37:09,480 --> 00:37:12,840 Speaker 1: So he somehow was like, here, I'm not trying 606 00:37:12,920 --> 00:37:16,600 Speaker 1: to nail what humans look like. I'm going to take 607 00:37:16,640 --> 00:37:19,120 Speaker 1: you to another world where these people live, and if 608 00:37:19,160 --> 00:37:21,319 Speaker 1: you lived in this world, you would look like this too. 609 00:37:21,400 --> 00:37:24,040 Speaker 1: It's weird. It's like he bridged an uncanny 610 00:37:24,120 --> 00:37:26,959 Speaker 1: valley that doesn't exist in this dimension. Yeah, he built 611 00:37:27,000 --> 00:37:32,279 Speaker 1: a temporary disintegrating bridge across the Uncanny Valley. I think 612 00:37:32,320 --> 00:37:34,799 Speaker 1: the biggest example in recent years, or the one 613 00:37:34,840 --> 00:37:37,480 Speaker 1: that got the most attention, was in Rogue One. Did 614 00:37:37,560 --> 00:37:39,680 Speaker 1: you see that? The Star Wars movie? I haven't 615 00:37:39,680 --> 00:37:41,840 Speaker 1: seen any of the new Star Wars ones. 616 00:37:42,040 --> 00:37:46,239 Speaker 1: I've seen the first six, I guess, but none of 617 00:37:46,280 --> 00:37:50,279 Speaker 1: the two new, new ones. Well, in Rogue One, 618 00:37:50,640 --> 00:37:54,480 Speaker 1: they completely bring back to life Grand Moff Tarkin, who 619 00:37:55,080 --> 00:37:59,520 Speaker 1: was played by the deceased Peter Cushing, and they brought 620 00:37:59,600 --> 00:38:03,840 Speaker 1: him back as a character in this movie. And in 621 00:38:03,960 --> 00:38:07,520 Speaker 1: the theater, like, when it first happens, he's 622 00:38:07,560 --> 00:38:09,320 Speaker 1: got his back to you and it's sort of in 623 00:38:09,360 --> 00:38:13,200 Speaker 1: the shadows, and you're like, oh wow, like, that's pretty cool.
624 00:38:13,400 --> 00:38:15,600 Speaker 1: And I didn't know that they would do that. But 625 00:38:15,760 --> 00:38:20,520 Speaker 1: they got too comfortable, I think, and showed 626 00:38:20,560 --> 00:38:23,640 Speaker 1: too much and gave him too many lines and too 627 00:38:23,719 --> 00:38:28,480 Speaker 1: much light, and then it became Uncanny Valley. Oh yeah, 628 00:38:28,600 --> 00:38:31,640 Speaker 1: for sure. I think about poor Peter Cushing's family having 629 00:38:31,719 --> 00:38:34,560 Speaker 1: to see that. Yeah, I don't know how often they 630 00:38:34,680 --> 00:38:37,479 Speaker 1: just weep during that movie. Well, I'm curious about, 631 00:38:37,520 --> 00:38:39,400 Speaker 1: like, life rights and image rights and stuff like that, 632 00:38:39,480 --> 00:38:41,640 Speaker 1: if they had to get that cleared. I don't even 633 00:38:41,680 --> 00:38:44,400 Speaker 1: know. I'm sure there's a backstory there. Cushing was 634 00:38:44,480 --> 00:38:48,200 Speaker 1: famously so mellow that he would have taken a draw off 635 00:38:48,280 --> 00:38:51,360 Speaker 1: his doobie and been like, that's whatever, man. Yeah, I 636 00:38:51,400 --> 00:38:53,200 Speaker 1: think he spent the last year of his life on 637 00:38:53,320 --> 00:38:58,200 Speaker 1: his weed farm in Northern California. What about this Mars 638 00:38:58,360 --> 00:39:01,080 Speaker 1: Needs Moms? I had never, ever heard of that movie, 639 00:39:01,160 --> 00:39:03,560 Speaker 1: and so I went and watched the trailer, and it 640 00:39:03,719 --> 00:39:06,200 Speaker 1: still was like, I have no idea what this is. Yeah, 641 00:39:06,600 --> 00:39:09,719 Speaker 1: you know that comic strip Bloom County? Well, you know, 642 00:39:09,840 --> 00:39:12,760 Speaker 1: I'm a huge, huge lifelong Bloom County fan. Okay, 643 00:39:12,920 --> 00:39:15,520 Speaker 1: so maybe you know how to say the 644 00:39:15,600 --> 00:39:21,719 Speaker 1: last name. It's Berkeley Breathed. Breathed? But I don't 645 00:39:21,719 --> 00:39:23,560 Speaker 1: know if I've ever heard it said out loud. Breathed. 646 00:39:23,600 --> 00:39:26,520 Speaker 1: It sounds nice. Let's go with that. So Berkeley Breathed, 647 00:39:26,719 --> 00:39:30,319 Speaker 1: the guy who did Bloom County, he wrote 648 00:39:30,360 --> 00:39:33,719 Speaker 1: a book, a children's book, called Mars Needs Moms, and 649 00:39:33,840 --> 00:39:37,400 Speaker 1: basically Mars had some sort of shortage of moms, so 650 00:39:37,760 --> 00:39:40,520 Speaker 1: the Martians came and kidnapped human moms, and it was 651 00:39:40,600 --> 00:39:42,600 Speaker 1: up to the human kids to go get their moms 652 00:39:42,680 --> 00:39:45,719 Speaker 1: back from Mars. Right. Pretty cute little premise, but they 653 00:39:45,800 --> 00:39:48,399 Speaker 1: took it and ran it through Zemeckis's nightmare mill. 654 00:39:49,360 --> 00:39:54,359 Speaker 1: ImageMovers Digital was the trade name 655 00:39:54,400 --> 00:39:56,480 Speaker 1: of it, but everybody knew to just steer clear of 656 00:39:56,560 --> 00:40:00,560 Speaker 1: this place, right. And this was like the apex, or 657 00:40:00,719 --> 00:40:04,480 Speaker 1: the, what's the opposite of the apex?
The valley, I guess, 658 00:40:04,560 --> 00:40:07,840 Speaker 1: so the deepest part of the CGI valley, 659 00:40:08,239 --> 00:40:12,000 Speaker 1: of the Uncanny Valley, right. The 660 00:40:12,040 --> 00:40:15,560 Speaker 1: stuff that they created was so off, and just 661 00:40:15,680 --> 00:40:19,920 Speaker 1: so spectacularly and colossally off, that when Disney 662 00:40:20,000 --> 00:40:23,640 Speaker 1: came along and bought this company, they came in, looked 663 00:40:23,680 --> 00:40:27,120 Speaker 1: around, and said, we're shutting you down. This movie is done. 664 00:40:27,120 --> 00:40:29,359 Speaker 1: We're not doing this anymore. What you guys are 665 00:40:29,400 --> 00:40:33,200 Speaker 1: doing here is wrong, and you're all going to jail. Yeah, 666 00:40:33,280 --> 00:40:35,680 Speaker 1: here's my thoughts on that. I watched the trailer, and 667 00:40:36,239 --> 00:40:38,319 Speaker 1: it didn't look any worse than any of the other 668 00:40:38,360 --> 00:40:40,759 Speaker 1: ones to me. And in fact, I don't know the 669 00:40:40,840 --> 00:40:42,759 Speaker 1: characters' names, but there's a kid, and then there's this 670 00:40:42,840 --> 00:40:46,200 Speaker 1: one kind of chubby guy on Mars. The chubby guy 671 00:40:46,239 --> 00:40:49,160 Speaker 1: looked pretty good, actually, I thought. I think this was 672 00:40:49,239 --> 00:40:52,800 Speaker 1: a victim. I bet the movie sucked really bad, and 673 00:40:52,920 --> 00:40:55,040 Speaker 1: I think it was the last straw at the end of 674 00:40:55,120 --> 00:40:59,040 Speaker 1: all these Uncanny Valley failures. Yeah, because, again, this 675 00:40:59,200 --> 00:41:03,760 Speaker 1: is the same company that had created The Polar Express, 676 00:41:04,040 --> 00:41:07,000 Speaker 1: the nightmare factory, and A Christmas Carol did not do 677 00:41:07,239 --> 00:41:10,000 Speaker 1: very well either. So yeah, I think it definitely 678 00:41:10,120 --> 00:41:13,520 Speaker 1: bore the brunt of its predecessors as well. But 679 00:41:13,640 --> 00:41:16,440 Speaker 1: I thought this was as bad as it got, if 680 00:41:16,480 --> 00:41:18,919 Speaker 1: you ask me. I totally saw what Disney saw 681 00:41:19,560 --> 00:41:22,680 Speaker 1: with this one. Anytime something is marked as the thing 682 00:41:22,760 --> 00:41:26,399 Speaker 1: that killed the thing, right, it's always just the last thing. Yeah, 683 00:41:26,440 --> 00:41:29,640 Speaker 1: you're right, you know. Yeah, but it could have also 684 00:41:29,719 --> 00:41:32,799 Speaker 1: been the thing that saved the thing, had they gotten 685 00:41:32,840 --> 00:41:36,399 Speaker 1: it right. You know, that's true. So, like I said, 686 00:41:36,520 --> 00:41:38,799 Speaker 1: Mori was like, and every time I say Mori now, 687 00:41:38,880 --> 00:41:43,560 Speaker 1: unless I say it like Morty, just saying Maury, I 688 00:41:43,640 --> 00:41:48,279 Speaker 1: think of the wig salesman in Goodfellas. He's like, 689 00:41:48,440 --> 00:41:51,440 Speaker 1: give me my money, and Ray Liotta's sitting there 690 00:41:51,520 --> 00:41:55,120 Speaker 1: laughing because Morrie's toupee falls off. Imagine that guy 691 00:41:55,280 --> 00:41:57,920 Speaker 1: is the guy who came up with the Uncanny Valley. Okay, 692 00:41:58,239 --> 00:42:03,759 Speaker 1: it gives it a whole different spin, too, right. So Mori says, 693 00:42:04,520 --> 00:42:07,160 Speaker 1: you just don't even try.
Guys, like, you're never going 694 00:42:07,280 --> 00:42:09,560 Speaker 1: to be able to do this. Even if you can, 695 00:42:09,719 --> 00:42:11,800 Speaker 1: we're so far away from it. And this was the 696 00:42:12,480 --> 00:42:14,600 Speaker 1: seventies when he was saying it, and it still holds true 697 00:42:14,680 --> 00:42:17,920 Speaker 1: now. We're so far away from this that 698 00:42:18,160 --> 00:42:22,240 Speaker 1: maybe you should put your emphasis elsewhere. And the example 699 00:42:22,320 --> 00:42:25,279 Speaker 1: he gave was, say, like, a prosthetic hand, right. Rather 700 00:42:25,360 --> 00:42:28,719 Speaker 1: than trying to create a lifelike prosthetic hand that 701 00:42:29,000 --> 00:42:32,920 Speaker 1: was in danger of creeping people out, which is the 702 00:42:33,000 --> 00:42:36,480 Speaker 1: opposite of what somebody wearing a prosthetic hand wants when 703 00:42:36,520 --> 00:42:39,080 Speaker 1: they're walking around with the prosthetic hand, he said, you know, 704 00:42:39,480 --> 00:42:44,160 Speaker 1: maybe choose something like well-sanded, beautifully grained 705 00:42:44,200 --> 00:42:47,319 Speaker 1: wood in the shape of a human hand. It gets 706 00:42:47,360 --> 00:42:49,839 Speaker 1: the point across: this is my hand. I lost my hand, 707 00:42:49,880 --> 00:42:51,640 Speaker 1: I don't have my hand, but there's nothing to be 708 00:42:51,719 --> 00:42:54,480 Speaker 1: creeped out about here. It's kind of beautiful looking, isn't it? 709 00:42:55,040 --> 00:42:58,080 Speaker 1: That was Mori's take, and a lot of people sided 710 00:42:58,160 --> 00:43:00,200 Speaker 1: with him as well. As a matter of fact, you know, 711 00:43:00,280 --> 00:43:02,239 Speaker 1: I said, I think at the beginning, that he was 712 00:43:02,400 --> 00:43:06,680 Speaker 1: already an established roboticist when he wrote The Uncanny Valley 713 00:43:06,719 --> 00:43:09,759 Speaker 1: in nineteen seventy, and he went on to teach a lot 714 00:43:09,840 --> 00:43:13,399 Speaker 1: of roboticists as well. 715 00:43:13,760 --> 00:43:17,040 Speaker 1: And that very famous robot, A-S-I-M- 716 00:43:17,080 --> 00:43:21,000 Speaker 1: O, ASIMO, you know the one I'm talking about. He 717 00:43:21,160 --> 00:43:23,280 Speaker 1: was one of the first ones that could jog in place. 718 00:43:23,520 --> 00:43:27,120 Speaker 1: And he's kind of humanoid, for sure, but very cute, 719 00:43:27,320 --> 00:43:32,239 Speaker 1: all white, shiny lacquer plastic. You've seen him before. 720 00:43:32,800 --> 00:43:36,560 Speaker 1: He was created by one of Mori's students, who clearly 721 00:43:36,640 --> 00:43:40,399 Speaker 1: subscribed to Mori's theory that you're 722 00:43:40,440 --> 00:43:43,799 Speaker 1: not gonna overcome the Uncanny Valley. So just make these 723 00:43:43,880 --> 00:43:47,719 Speaker 1: things exaggerated and non-humanlike, and you'll have 724 00:43:47,880 --> 00:43:51,160 Speaker 1: people love your robot. Yeah, I think that's a good tack. Yeah, 725 00:43:51,719 --> 00:43:53,920 Speaker 1: all right, we're gonna take another break here, and then 726 00:43:54,440 --> 00:43:57,160 Speaker 1: come back and finish up with a little bit where we're 727 00:43:57,160 --> 00:44:00,840 Speaker 1: gonna take a step back and just talk generally about creepiness. 728 00:44:22,200 --> 00:44:30,840 Speaker 1: All right, so I promised that we would talk about creepiness.
729 00:44:31,880 --> 00:44:36,360 Speaker 1: So that's what we'll do. The creeps. Such a great phrase. 730 00:44:36,680 --> 00:44:39,279 Speaker 1: Everyone says it gives me the creeps. It's just such 731 00:44:39,360 --> 00:44:42,359 Speaker 1: a... it's one of those phrases that sums things 732 00:44:42,560 --> 00:44:47,120 Speaker 1: up so perfectly. It's livid as a fresh bruise. And 733 00:44:47,640 --> 00:44:51,240 Speaker 1: we have Charles Dickens to thank for this, evidently, because 734 00:44:51,280 --> 00:44:54,280 Speaker 1: he gets credit for using the creeps in David 735 00:44:54,320 --> 00:44:59,560 Speaker 1: Copperfield. People had had this feeling before, this sort 736 00:44:59,600 --> 00:45:02,560 Speaker 1: of unpleasantness, you know what it feels like to 737 00:45:02,600 --> 00:45:04,800 Speaker 1: get the creeps. But they said things like eel-like 738 00:45:05,080 --> 00:45:08,120 Speaker 1: or clammy. Not bad, not bad. But if you said, 739 00:45:08,840 --> 00:45:11,040 Speaker 1: that thing makes me feel eel-like, today, people would be like, 740 00:45:11,200 --> 00:45:13,759 Speaker 1: what the heck are you talking about? Right. I think 741 00:45:13,800 --> 00:45:16,719 Speaker 1: also you would use that to describe somebody who gave 742 00:45:16,760 --> 00:45:18,960 Speaker 1: you the creeps as well, like, that guy is really clammy, 743 00:45:20,480 --> 00:45:22,279 Speaker 1: you know what I mean? Sure. Well, that means you're 744 00:45:22,320 --> 00:45:25,280 Speaker 1: touching them, though. Like, Peter Lorre would be clammy 745 00:45:25,400 --> 00:45:28,279 Speaker 1: or eel-like in some of his characters. You know 746 00:45:28,360 --> 00:45:33,800 Speaker 1: Peter Lorre? How do you not? So everybody understands 747 00:45:33,840 --> 00:45:36,560 Speaker 1: that there is such a thing as the creeps, right? But 748 00:45:38,120 --> 00:45:42,360 Speaker 1: we don't understand why we get the creeps, still to 749 00:45:42,480 --> 00:45:44,759 Speaker 1: this day. And again, this is important and relates to 750 00:45:44,800 --> 00:45:48,240 Speaker 1: the Uncanny Valley, because another way to put the creeps 751 00:45:49,239 --> 00:45:53,719 Speaker 1: is negative affinity. Remember, affinity was the y axis, and 752 00:45:53,880 --> 00:45:57,000 Speaker 1: when the valley dropped down below the x axis, you 753 00:45:57,120 --> 00:46:00,279 Speaker 1: dipped into negative affinity. You dip into the 754 00:46:00,440 --> 00:46:06,160 Speaker 1: creeps, exactly right. So you talked about Ernst Jentsch. Yeah, yeah, 755 00:46:06,239 --> 00:46:09,600 Speaker 1: I get it. He was probably the first person to 756 00:46:09,640 --> 00:46:15,040 Speaker 1: actually sit down and study the creeps, or creepiness. Creepy himself? 757 00:46:15,440 --> 00:46:17,200 Speaker 1: I don't know. I think he was just kind of 758 00:46:17,280 --> 00:46:22,000 Speaker 1: a neat-thinking man, right. So, Jentsch. Man, I like 759 00:46:22,120 --> 00:46:25,839 Speaker 1: saying his name a lot more now. He wrote an 760 00:46:26,000 --> 00:46:29,200 Speaker 1: essay in nineteen oh six called On the Psychology of 761 00:46:29,320 --> 00:46:33,920 Speaker 1: the Uncanny, and that's the English translation. The German 762 00:46:34,040 --> 00:46:38,400 Speaker 1: word he used, like you said, is unheimlich. Is that right? 763 00:46:38,600 --> 00:46:45,880 Speaker 1: Not unheimlich? Unheimlich. Okay, better, thank you.
He 764 00:46:46,040 --> 00:46:49,680 Speaker 1: used that word. And unheimlich is a variation of the 765 00:46:49,719 --> 00:46:53,520 Speaker 1: word heimlich, which is not, just to say, the maneuver. 766 00:46:54,040 --> 00:46:59,040 Speaker 1: It means something else entirely, which is homey, or familiar, right. 767 00:47:00,120 --> 00:47:04,000 Speaker 1: Unheimlich is the opposite of that. It's something strange and foreign, 768 00:47:04,360 --> 00:47:08,000 Speaker 1: and very frequently it's translated as uncanny here in the West. 769 00:47:08,080 --> 00:47:13,359 Speaker 1: Here in England. Yeah. And he 770 00:47:14,280 --> 00:47:15,880 Speaker 1: thought a lot about this, and one of the 771 00:47:15,960 --> 00:47:18,200 Speaker 1: things that he noted, which I thought was pretty interesting, 772 00:47:18,800 --> 00:47:23,399 Speaker 1: was that people that he thought were more intellectually discriminating 773 00:47:24,160 --> 00:47:27,799 Speaker 1: are more prone to have these uncanny experiences, because they're 774 00:47:27,880 --> 00:47:32,480 Speaker 1: critical thinkers about the world. Right. So that makes sense, 775 00:47:32,560 --> 00:47:34,879 Speaker 1: like, they just pay attention maybe a little more. Yeah, 776 00:47:35,040 --> 00:47:37,200 Speaker 1: or they're curious. Like, they're like, why am I 777 00:47:37,280 --> 00:47:38,840 Speaker 1: creeped out? Let me get to the bottom of this, 778 00:47:39,000 --> 00:47:40,640 Speaker 1: rather than, oh, I'm creeped out, and then go 779 00:47:41,160 --> 00:47:44,000 Speaker 1: eat a whole thing of Chips Ahoy and hide under 780 00:47:44,040 --> 00:47:47,879 Speaker 1: the covers. He also actually went even further 781 00:47:48,000 --> 00:47:52,360 Speaker 1: and said, it's possible that all of humanity's knowledge 782 00:47:52,600 --> 00:47:56,759 Speaker 1: has been accrued over these millions of years from 783 00:47:57,120 --> 00:48:02,320 Speaker 1: people investigating what's behind this creepiness. It's a pretty weird 784 00:48:02,360 --> 00:48:06,400 Speaker 1: and neat theory of knowledge. Well, yeah, and speaking of theories, 785 00:48:06,400 --> 00:48:09,759 Speaker 1: there are a bunch of theories on creepiness and 786 00:48:09,840 --> 00:48:12,000 Speaker 1: why this happens, and I think they're all pretty interesting. 787 00:48:12,800 --> 00:48:15,560 Speaker 1: The first one is called pathogen avoidance theory, and we 788 00:48:15,640 --> 00:48:19,440 Speaker 1: talked earlier about evolution, and this one kind of 789 00:48:19,480 --> 00:48:24,239 Speaker 1: fits into that bucket. It's basically a warning that we 790 00:48:24,440 --> 00:48:28,360 Speaker 1: have evolved to have in our brain. It says, that 791 00:48:28,600 --> 00:48:32,319 Speaker 1: person is off, they may even be diseased. You don't want 792 00:48:32,400 --> 00:48:35,520 Speaker 1: to go near them; you want to avoid that pathogen. 793 00:48:36,680 --> 00:48:40,879 Speaker 1: It makes sense. Yeah, it's pretty approachable. There's 794 00:48:40,880 --> 00:48:43,560 Speaker 1: another one that I've seen that I think is fairly recent, 795 00:48:43,640 --> 00:48:48,560 Speaker 1: and it's the idea that things give us the creeps 796 00:48:49,280 --> 00:48:55,320 Speaker 1: when they're trying to nonverbally mimic people and failing. Like, 797 00:48:55,440 --> 00:48:59,120 Speaker 1: a robot doesn't do it right, so we're like, oh, that's unsettling.
798 00:48:59,560 --> 00:49:03,120 Speaker 1: Or somebody who you would describe as clammy or eel- 799 00:49:03,160 --> 00:49:06,640 Speaker 1: like maybe overdoes it a little bit, like they're trying 800 00:49:06,719 --> 00:49:10,239 Speaker 1: to fit in. It's not natural to them, and that 801 00:49:10,520 --> 00:49:12,839 Speaker 1: can give you the creeps as well. That makes sense, 802 00:49:12,960 --> 00:49:16,320 Speaker 1: but it doesn't really encompass everything. It's definitely not a 803 00:49:16,880 --> 00:49:19,480 Speaker 1: unified theory of creepiness. It just seems to kind of 804 00:49:19,560 --> 00:49:24,359 Speaker 1: inhabit one corner of the creepy spectrum. Yeah. There's another 805 00:49:24,400 --> 00:49:29,120 Speaker 1: one called violation of expectation. This is, like, you know, 806 00:49:29,280 --> 00:49:32,120 Speaker 1: you've shaken hands with thousands of people over your life, 807 00:49:32,560 --> 00:49:34,960 Speaker 1: but if you go and you shake a hand and 808 00:49:35,400 --> 00:49:37,680 Speaker 1: you don't know that you're going to get a prosthetic, 809 00:49:37,800 --> 00:49:41,279 Speaker 1: it may give you the creeps. And that 810 00:49:41,880 --> 00:49:45,040 Speaker 1: is probably very fleeting, because you might just say, oh, okay, 811 00:49:45,160 --> 00:49:47,120 Speaker 1: well, it doesn't give me the creeps now, it was 812 00:49:47,200 --> 00:49:50,480 Speaker 1: just unexpected for me. And actually, you said that was fleeting, right, Chuck? 813 00:49:50,560 --> 00:49:54,320 Speaker 1: So I think it was Jentsch or somebody who said 814 00:49:55,120 --> 00:49:59,600 Speaker 1: that what gives us the creeps one time might 815 00:49:59,680 --> 00:50:02,480 Speaker 1: not give us the creeps later on, which will kind 816 00:50:02,520 --> 00:50:05,759 Speaker 1: of come into play later. Like, Ernst Jentsch basically 817 00:50:05,880 --> 00:50:09,080 Speaker 1: laid the groundwork for the study of creepiness, and he 818 00:50:09,160 --> 00:50:11,520 Speaker 1: seems to have gotten a lot of it right right 819 00:50:11,560 --> 00:50:14,719 Speaker 1: out of the gate. Yeah. And like you said, if 820 00:50:14,760 --> 00:50:17,120 Speaker 1: it doesn't give you the creeps later, then 821 00:50:17,160 --> 00:50:21,160 Speaker 1: that would fit neatly into the violation of expectation, because 822 00:50:22,120 --> 00:50:25,239 Speaker 1: then you can change your expectation, right? Exactly. Yes. 823 00:50:25,960 --> 00:50:31,480 Speaker 1: Another one's mortality salience theory. Yeah, this one Mori and 824 00:50:31,640 --> 00:50:35,480 Speaker 1: Freud both subscribed to, and it basically said that 825 00:50:36,200 --> 00:50:41,120 Speaker 1: when we encounter, like, a robot, or an automaton 826 00:50:41,320 --> 00:50:44,880 Speaker 1: in Freud's day, they remind us of dead people, 827 00:50:45,120 --> 00:50:47,279 Speaker 1: which in turn gets our mind to thinking about how 828 00:50:47,480 --> 00:50:49,400 Speaker 1: we're going to die one day, and so all of 829 00:50:49,440 --> 00:50:52,520 Speaker 1: a sudden we find ourselves in the Uncanny Valley. Right, 830 00:50:52,960 --> 00:50:56,240 Speaker 1: which again raises another, sorry for the sidetrack, but raises 831 00:50:56,239 --> 00:51:01,960 Speaker 1: another of Jentsch's points: is uncanniness inherent in 832 00:51:02,040 --> 00:51:05,920 Speaker 1: the object?
Or is it inside the observer who's experiencing 833 00:51:06,000 --> 00:51:10,600 Speaker 1: the creeps, or uncanniness? I think it's in the observer. Yeah, 834 00:51:10,640 --> 00:51:12,680 Speaker 1: I think it is too, which would explain why it 835 00:51:12,760 --> 00:51:16,400 Speaker 1: can go away when you come to experience 836 00:51:16,480 --> 00:51:19,520 Speaker 1: it again. Like, when 837 00:51:19,560 --> 00:51:22,720 Speaker 1: you shake the same prosthetic hand again, it's not creepy 838 00:51:22,840 --> 00:51:25,520 Speaker 1: the second time. It might even be interesting. Or why 839 00:51:25,600 --> 00:51:29,000 Speaker 1: some people might not experience it at all. Like, someone 840 00:51:29,120 --> 00:51:31,800 Speaker 1: might sit there and see a doll, and the doll's 841 00:51:31,880 --> 00:51:34,000 Speaker 1: head turns and looks at them, and they're like, neat, 842 00:51:35,160 --> 00:51:37,520 Speaker 1: how much for that doll? Which means you've just met 843 00:51:37,560 --> 00:51:43,040 Speaker 1: a serial killer. And then the doll's creeped out after that. 844 00:51:43,200 --> 00:51:45,799 Speaker 1: This one I like, even though I can 845 00:51:45,840 --> 00:51:51,640 Speaker 1: never say this word for some reason: the anthropomorphism-dehumanization dichotomy, 846 00:51:51,719 --> 00:51:55,320 Speaker 1: which is basically where we attribute these human attributes to the 847 00:51:55,440 --> 00:52:00,680 Speaker 1: robot until we realize that it doesn't have them. Right. So, like, 848 00:52:00,800 --> 00:52:02,919 Speaker 1: we're looking at this robot that looks like a person, 849 00:52:03,000 --> 00:52:05,160 Speaker 1: and we're saying, oh, look, it's just like a human, and 850 00:52:05,200 --> 00:52:07,080 Speaker 1: they're walking and they're talking and they're smiling, and then, 851 00:52:07,120 --> 00:52:09,839 Speaker 1: oh god, look at their eyes. Their eyes are dead. 852 00:52:09,920 --> 00:52:12,080 Speaker 1: Look at the eyes. They don't have any 853 00:52:12,200 --> 00:52:15,680 Speaker 1: internal thoughts at all. They're not human. And then all 854 00:52:15,680 --> 00:52:18,640 Speaker 1: of a sudden, Uncanny Valley. Which is a little 855 00:52:18,640 --> 00:52:22,080 Speaker 1: bit about expectation too, I think. There's a crossover a little, 856 00:52:22,120 --> 00:52:27,840 Speaker 1: I think. And so creepiness, I think, especially the modern 857 00:52:28,040 --> 00:52:31,840 Speaker 1: incarnation of creepiness, and these are my thoughts, 858 00:52:32,320 --> 00:52:37,800 Speaker 1: seems to represent a crossroads, right, where, evolutionarily, 859 00:52:38,520 --> 00:52:43,480 Speaker 1: creepiness, I think, probably alerts us. We're 860 00:52:43,520 --> 00:52:46,560 Speaker 1: on alert when something's creeping us out. We're really focused 861 00:52:46,600 --> 00:52:50,400 Speaker 1: on that thing right then. But we're also bound by 862 00:52:50,520 --> 00:52:54,160 Speaker 1: society not to just turn and run from anything that 863 00:52:54,280 --> 00:52:57,080 Speaker 1: could conceivably be a threat.
You can also take it 864 00:52:57,120 --> 00:53:00,719 Speaker 1: a little further and say that, evolutionarily speaking, it would 865 00:53:00,760 --> 00:53:02,560 Speaker 1: not make sense for us to turn and run from 866 00:53:02,640 --> 00:53:05,359 Speaker 1: every single thing that could conceivably be a threat before 867 00:53:05,480 --> 00:53:07,759 Speaker 1: we've identified it as a threat, because we would be 868 00:53:07,840 --> 00:53:10,320 Speaker 1: using up a lot of calories and energy, and we 869 00:53:10,360 --> 00:53:12,239 Speaker 1: would have to find more food than we do. It would 870 00:53:12,239 --> 00:53:15,440 Speaker 1: be inefficient. Right. So we're kind of bound socially to 871 00:53:15,560 --> 00:53:19,120 Speaker 1: stand in place until we identify something as a threat 872 00:53:19,200 --> 00:53:23,319 Speaker 1: or not, in which case, during this period, that's when 873 00:53:23,360 --> 00:53:27,560 Speaker 1: we experience creepiness. Yeah, and I think everyone has experienced this. 874 00:53:27,680 --> 00:53:30,360 Speaker 1: Like, you're in a coffee shop or something, and, like, 875 00:53:30,520 --> 00:53:34,920 Speaker 1: some super creepy dude comes in, and if you're like me, 876 00:53:35,040 --> 00:53:38,399 Speaker 1: you're just like, all right, I'm gonna 877 00:53:38,480 --> 00:53:40,719 Speaker 1: keep my eye on that guy. I'm not 878 00:53:40,760 --> 00:53:44,160 Speaker 1: gonna bolt and run, but I might stay near the door. Sure, 879 00:53:44,440 --> 00:53:48,680 Speaker 1: you know, I might get my car keys ready. Exactly right. 880 00:53:49,080 --> 00:53:52,520 Speaker 1: It's this weird social contract, and, you 881 00:53:52,600 --> 00:53:55,719 Speaker 1: know, I feel bad for people that just inherently look 882 00:53:55,760 --> 00:53:59,640 Speaker 1: a little creepy. Well, yeah, let's talk about that. So 883 00:53:59,800 --> 00:54:02,520 Speaker 1: there were these researchers from Knox College 884 00:54:02,760 --> 00:54:05,960 Speaker 1: who did what they billed as the first empirical study 885 00:54:06,000 --> 00:54:09,839 Speaker 1: of creepiness, and this was in two thousand sixteen. 886 00:54:10,040 --> 00:54:13,560 Speaker 1: It was an online survey, very little heavy lifting, but 887 00:54:13,680 --> 00:54:16,640 Speaker 1: it was a pretty cool survey. It was in 888 00:54:16,760 --> 00:54:22,359 Speaker 1: four parts. And what they found overall was that, yeah, 889 00:54:22,680 --> 00:54:27,920 Speaker 1: physical characteristics, physical traits that are almost stereotypically linked to 890 00:54:28,280 --> 00:54:33,080 Speaker 1: creepy people do have an effect. They are creepy, 891 00:54:33,840 --> 00:54:37,200 Speaker 1: as far as the participants in this study are concerned. 892 00:54:38,320 --> 00:54:42,160 Speaker 1: So the first section said, hey, you know, what 893 00:54:42,360 --> 00:54:45,440 Speaker 1: is the likelihood that this person is creepy? And there were, like, 894 00:54:45,880 --> 00:54:50,799 Speaker 1: you know, descriptions of them with forty-four different behaviors. Right. 895 00:54:51,160 --> 00:54:54,680 Speaker 1: And the second part was, participants rated the creepiness of 896 00:54:54,719 --> 00:55:01,279 Speaker 1: twenty-one different occupations. The third section said, 897 00:55:01,400 --> 00:55:04,960 Speaker 1: list two hobbies that you think are creepy.
They only 898 00:55:05,040 --> 00:55:07,759 Speaker 1: needed two; it was open-ended, too. And then in the 899 00:55:07,880 --> 00:55:11,520 Speaker 1: last section, the participants said whether or not they 900 00:55:11,560 --> 00:55:15,919 Speaker 1: agreed with fifteen statements about the nature of creepy people. Yeah. 901 00:55:16,520 --> 00:55:19,360 Speaker 1: And overall, again, like, they found, like, yes, if you 902 00:55:19,480 --> 00:55:23,280 Speaker 1: have physical traits that people find creepy, like bulging eyes, 903 00:55:23,480 --> 00:55:26,200 Speaker 1: or you lick your lips a lot, or, you know, 904 00:55:26,400 --> 00:55:28,759 Speaker 1: you arch your fingers and then just kind of 905 00:55:28,840 --> 00:55:33,440 Speaker 1: tap them together a lot. Okay, that's kind of creepy. 906 00:55:33,560 --> 00:55:38,200 Speaker 1: But the Knox researchers concluded that those aren't necessarily creepy 907 00:55:38,320 --> 00:55:41,080 Speaker 1: in and of themselves. It's when it's in conjunction with 908 00:55:41,239 --> 00:55:47,920 Speaker 1: other creepy behavior that somebody comes across as creepy. Right. 909 00:55:48,080 --> 00:55:50,279 Speaker 1: And of course, the one behavior they put in here 910 00:55:50,280 --> 00:55:53,359 Speaker 1: that I think was probably universally creepy was someone who 911 00:55:53,440 --> 00:55:57,919 Speaker 1: persistently steers the conversation towards a sexual topic. Right, yeah, 912 00:55:58,000 --> 00:56:00,800 Speaker 1: you don't do that. They also found 913 00:56:01,560 --> 00:56:05,320 Speaker 1: that the participants, and this is, I 914 00:56:05,400 --> 00:56:10,439 Speaker 1: think, eight hundred and forty-one people, said 915 00:56:10,480 --> 00:56:13,440 Speaker 1: that men were more likely to be creepy than women. 916 00:56:14,560 --> 00:56:18,680 Speaker 1: I think that's generally true. I don't remember getting 917 00:56:18,680 --> 00:56:22,600 Speaker 1: the creeps a lot in my life strictly 918 00:56:23,040 --> 00:56:26,040 Speaker 1: from the appearance of a woman, right, but a lot 919 00:56:26,200 --> 00:56:28,320 Speaker 1: of dudes on a weekly basis give me the creeps. 920 00:56:29,400 --> 00:56:31,759 Speaker 1: But we should say, so, there's a website called 921 00:56:31,840 --> 00:56:35,160 Speaker 1: girl dot com, g u r l dot com, and 922 00:56:35,360 --> 00:56:37,400 Speaker 1: they went on Reddit and found a thread 923 00:56:37,480 --> 00:56:40,160 Speaker 1: somewhere that they wrote a blog post about, and now 924 00:56:40,239 --> 00:56:43,520 Speaker 1: we're reporting on it, so it's really come full circle. 925 00:56:44,520 --> 00:56:47,240 Speaker 1: But it was a thread about how women can be creepy, 926 00:56:47,400 --> 00:56:51,359 Speaker 1: and it was written by dudes, and there are 927 00:56:51,480 --> 00:56:57,480 Speaker 1: some things that apparently are universally creepy to guys about women. Right. 928 00:56:58,560 --> 00:57:02,359 Speaker 1: Women that are too needy can be creepy. Women who 929 00:57:02,480 --> 00:57:06,279 Speaker 1: use baby talk too much, or who, quote, never leave 930 00:57:06,320 --> 00:57:10,239 Speaker 1: a guy alone. Yeah, I'm just gonna go 931 00:57:10,280 --> 00:57:14,160 Speaker 1: ahead and dump that right into the trash bin. That's 932 00:57:14,200 --> 00:57:18,320 Speaker 1: my only comment on that. Okay, what about eHarmony?
933 00:57:18,560 --> 00:57:21,280 Speaker 1: I mean, if you come home and Glenn Close is in 934 00:57:21,320 --> 00:57:26,720 Speaker 1: your kitchen boiling your pet bunny, well, that's a threat. Yeah, 935 00:57:26,760 --> 00:57:29,920 Speaker 1: that's not even creepy. That's just a threat. Although I 936 00:57:29,960 --> 00:57:34,440 Speaker 1: will say, in Fatal Attraction, the scene where she 937 00:57:34,680 --> 00:57:37,920 Speaker 1: is sitting there clicking the light on and off, listening 938 00:57:37,960 --> 00:57:40,760 Speaker 1: to Madame Butterfly, that was kind of creepy. 939 00:57:41,160 --> 00:57:45,040 Speaker 1: I was trying to think of, like, a creepy woman, 940 00:57:45,240 --> 00:57:47,920 Speaker 1: and I really couldn't come up with anybody. Well, these 941 00:57:47,960 --> 00:57:51,120 Speaker 1: are creepy behaviors, though, you know. Not like Glenn 942 00:57:51,160 --> 00:57:53,400 Speaker 1: Close walked into the room and you're like, oh, I 943 00:57:53,440 --> 00:57:57,920 Speaker 1: don't know about that. Right, right. There's a difference 944 00:57:58,600 --> 00:58:02,360 Speaker 1: there. There's a difference between genuine creepiness and just 945 00:58:02,480 --> 00:58:05,600 Speaker 1: doing creepy things. I think it is much harder for 946 00:58:05,640 --> 00:58:08,000 Speaker 1: women to be creepy than men. I cannot think of a 947 00:58:08,120 --> 00:58:16,080 Speaker 1: single actual creepy woman. I'd like to hear from people, though. Yeah. 948 00:58:16,240 --> 00:58:20,439 Speaker 1: eHarmony. So we talked about Reddit. Now we're gonna 949 00:58:20,480 --> 00:58:23,480 Speaker 1: talk about eHarmony. They had an article where they 950 00:58:23,560 --> 00:58:26,560 Speaker 1: wrote advice to dudes. It was called How to Avoid 951 00:58:26,600 --> 00:58:32,400 Speaker 1: the Creep Zone, and their advice was for the 952 00:58:32,480 --> 00:58:35,640 Speaker 1: hobbies that you list to be just sort of vanilla. 953 00:58:36,400 --> 00:58:40,400 Speaker 1: Like, even if you are an amateur taxidermist, 954 00:58:40,800 --> 00:58:44,520 Speaker 1: maybe don't put that down, right. They said it can 955 00:58:44,600 --> 00:58:46,800 Speaker 1: be attractive for a guy to have an off-the- 956 00:58:46,880 --> 00:58:50,440 Speaker 1: beaten-path hobby, and one of the examples they gave 957 00:58:50,520 --> 00:58:54,080 Speaker 1: of an off-the-beaten-path hobby was collecting punk records. 958 00:58:55,440 --> 00:58:58,600 Speaker 1: But don't get weirder than that. Yeah, and, you 959 00:58:58,680 --> 00:59:01,360 Speaker 1: know, taxidermy, in and of itself, some people might say 960 00:59:01,400 --> 00:59:03,360 Speaker 1: is super creepy. We did an episode on that. Other 961 00:59:03,440 --> 00:59:06,600 Speaker 1: people might say, no, it's just beautiful artwork. But 962 00:59:07,280 --> 00:59:09,480 Speaker 1: Norman Bates was into taxidermy for a reason in 963 00:59:09,560 --> 00:59:14,440 Speaker 1: Psycho. It was unsettling. Yeah. And so 964 00:59:14,840 --> 00:59:20,080 Speaker 1: the Knox people who carried out this survey, 965 00:59:20,480 --> 00:59:24,160 Speaker 1: the Knox College researchers, they basically said, here's what we 966 00:59:24,280 --> 00:59:28,200 Speaker 1: think it is. Here's creepiness explained. And what they explained 967 00:59:28,360 --> 00:59:33,360 Speaker 1: was what can be called the threat ambiguity theory.
Yeah, 968 00:59:33,440 --> 00:59:35,080 Speaker 1: this one, I think we kind of put a 969 00:59:35,160 --> 00:59:37,240 Speaker 1: cherry on top of this one. Yeah, we really did 970 00:59:37,320 --> 00:59:40,200 Speaker 1: like it. It's just basically where you are creeped out 971 00:59:40,200 --> 00:59:43,280 Speaker 1: by something because your hackles are raised right then, and 972 00:59:44,040 --> 00:59:46,640 Speaker 1: it's because you haven't determined whether that thing's a threat 973 00:59:46,760 --> 00:59:51,360 Speaker 1: or not. Right. There's another one, though, that I subscribe to. 974 00:59:51,600 --> 00:59:55,560 Speaker 1: I think it is finally the unified theory of creepiness. 975 00:59:55,640 --> 00:59:58,880 Speaker 1: I think it covers everything. And it's called the category 976 00:59:59,000 --> 01:00:04,320 Speaker 1: ambiguity theory. Yeah. Now, did David Livingstone Smith 977 01:00:04,680 --> 01:00:07,320 Speaker 1: make this up, or did he just champion it? I 978 01:00:07,920 --> 01:00:10,880 Speaker 1: think he made it up, because he wrote about the 979 01:00:11,040 --> 01:00:14,200 Speaker 1: Knox researchers and said what they're talking about can be 980 01:00:14,280 --> 01:00:19,520 Speaker 1: called threat ambiguity theory, as compared with category 981 01:00:19,560 --> 01:00:22,160 Speaker 1: ambiguity theory. He didn't cite anybody else, so it seemed 982 01:00:22,160 --> 01:00:25,080 Speaker 1: to be his own construct. Yeah. So this is the idea. 983 01:00:25,280 --> 01:00:28,240 Speaker 1: It's sort of like the threat ambiguity in that there 984 01:00:28,360 --> 01:00:32,320 Speaker 1: is some confusion, but it's not a threat, like, I 985 01:00:32,360 --> 01:00:34,400 Speaker 1: think this dude in the coffee shop is gonna kill me. 986 01:00:34,560 --> 01:00:38,439 Speaker 1: It's more like, I don't know how to categorize that guy, 987 01:00:39,040 --> 01:00:42,200 Speaker 1: and that freaks me out. And it's based on what's 988 01:00:42,240 --> 01:00:45,880 Speaker 1: called essentialism, right, where, if you are 989 01:00:46,000 --> 01:00:49,160 Speaker 1: a member of a species of animal, whether human or 990 01:00:49,320 --> 01:00:54,440 Speaker 1: raccoon or tiger, there's something about you, some 991 01:00:54,720 --> 01:00:58,680 Speaker 1: collection or set of things about you, that is 992 01:00:59,320 --> 01:01:04,640 Speaker 1: totally unique to your species. It's something you possess because 993 01:01:04,720 --> 01:01:08,000 Speaker 1: you're a member of that species, and because you're a member of that species, 994 01:01:08,280 --> 01:01:10,840 Speaker 1: you possess these things. And it can be very difficult 995 01:01:11,080 --> 01:01:12,840 Speaker 1: to put your finger on it, but it's just one 996 01:01:12,880 --> 01:01:15,000 Speaker 1: of those things that you know when you see it, 997 01:01:15,160 --> 01:01:17,400 Speaker 1: or know when you don't see it, right. And there 998 01:01:17,400 --> 01:01:21,200 Speaker 1: are clear borders between these things. You either have this 999 01:01:21,440 --> 01:01:23,800 Speaker 1: essence fully, or you don't have it at all: you're 1000 01:01:23,840 --> 01:01:27,000 Speaker 1: lacking, you're missing it, and something's really wrong. So 1001 01:01:27,280 --> 01:01:29,800 Speaker 1: in this article, he used the example of a 1002 01:01:29,840 --> 01:01:32,760 Speaker 1: wax dummy.
Yeah, have you ever been to, like, Madame 1003 01:01:32,800 --> 01:01:36,360 Speaker 1: Tussauds? I find that, and again with 1004 01:01:36,440 --> 01:01:38,920 Speaker 1: the eyes, the ones that work the best are the 1005 01:01:39,000 --> 01:01:43,120 Speaker 1: ones where they have sunglasses on. Oh yeah, again, Michael Jackson. 1006 01:01:43,680 --> 01:01:46,240 Speaker 1: That's right. But the whole point with these wax dummies, 1007 01:01:46,400 --> 01:01:49,840 Speaker 1: with the eyes, is they're fixed. They're not moving around. 1008 01:01:50,160 --> 01:01:53,920 Speaker 1: The facial expression is locked in. The skin itself, 1009 01:01:54,040 --> 01:01:56,720 Speaker 1: you know, you can only do so much. And Madame 1010 01:01:56,840 --> 01:01:59,920 Speaker 1: Tussauds and museums like that are the best of the best, 1011 01:02:00,200 --> 01:02:02,560 Speaker 1: and they do look pretty good. But that's the whole 1012 01:02:02,600 --> 01:02:06,800 Speaker 1: point with the Uncanny Valley: you can't get ninety-nine percent of the way there 1013 01:02:07,560 --> 01:02:10,120 Speaker 1: and say we're fine. It's that one percent that 1014 01:02:10,280 --> 01:02:15,200 Speaker 1: still gives people the creeps. Exactly. And it 1015 01:02:15,320 --> 01:02:18,400 Speaker 1: sums up everything. Like, the threat ambiguity could fall into this, 1016 01:02:18,840 --> 01:02:21,400 Speaker 1: whether you're talking about robots, whether you're talking about a 1017 01:02:21,960 --> 01:02:26,040 Speaker 1: half-dog, half-lizard combo, which Livingstone 1018 01:02:26,080 --> 01:02:30,560 Speaker 1: Smith cites. Yeah, that would be creepy 1019 01:02:30,600 --> 01:02:33,760 Speaker 1: when you saw it. So things that are a 1020 01:02:33,840 --> 01:02:37,640 Speaker 1: threat are creepy, but there are also things that are creepy 1021 01:02:37,720 --> 01:02:40,840 Speaker 1: that aren't a threat, and this category ambiguity theory figured 1022 01:02:40,880 --> 01:02:45,080 Speaker 1: it out. So if that's true, Chuck, and David Livingstone 1023 01:02:45,120 --> 01:02:47,920 Speaker 1: Smith figured out what the basis of creepiness is, we 1024 01:02:48,240 --> 01:02:54,560 Speaker 1: finally have the independent variable licked in Masahiro Mori's 1025 01:02:54,800 --> 01:02:57,920 Speaker 1: Uncanny Valley graph, and we can get to work. Is 1026 01:02:58,000 --> 01:03:01,680 Speaker 1: he still around? Yeah. Yes. Wonder if he's happy about all this? 1027 01:03:03,440 --> 01:03:06,880 Speaker 1: I get the impression that he's kind of just, whatever, 1028 01:03:07,240 --> 01:03:10,400 Speaker 1: gone off on his own little thing, and he's fine. 1029 01:03:10,920 --> 01:03:13,560 Speaker 1: He wrote it in nineteen seventy, after all, you know. I 1030 01:03:13,600 --> 01:03:17,120 Speaker 1: mean, almost fifty years ago, so he's probably up there. 1031 01:03:19,080 --> 01:03:22,160 Speaker 1: You got anything else? I got nothing else. Good one? Yeah. 1032 01:03:22,520 --> 01:03:24,439 Speaker 1: If you want to know more about the Uncanny Valley, 1033 01:03:24,560 --> 01:03:27,320 Speaker 1: we should say this was based originally on a Grabster article. 1034 01:03:28,600 --> 01:03:30,560 Speaker 1: But if you want to know more about the Uncanny Valley, 1035 01:03:30,600 --> 01:03:33,280 Speaker 1: come read that Grabster article. You can type Uncanny 1036 01:03:33,360 --> 01:03:35,439 Speaker 1: Valley in the search bar at how stuff works dot com.
1037 01:03:35,840 --> 01:03:41,080 Speaker 1: And since I said search bar, time for listener mail. Well, 1038 01:03:41,160 --> 01:03:43,680 Speaker 1: today it's a very special listener mail. This is the 1039 01:03:44,200 --> 01:03:46,880 Speaker 1: Josh edition, because you picked out a very special one. 1040 01:03:47,200 --> 01:03:49,360 Speaker 1: I love this one. I'm gonna butcher the dude's name, 1041 01:03:49,440 --> 01:03:51,600 Speaker 1: but that's right, take it away. It's a good one. Okay, 1042 01:03:52,440 --> 01:03:57,160 Speaker 1: I'm gonna call this one Email from a Real Irish Historian, 1043 01:03:58,920 --> 01:04:01,360 Speaker 1: and it must feel pretty good to have that as a job. 1044 01:04:01,560 --> 01:04:04,840 Speaker 1: Yeah, maybe. Okay. Hi, guys, I'm a big fan of 1045 01:04:04,880 --> 01:04:07,480 Speaker 1: the show. It's informative and insightful, and I find myself 1046 01:04:07,560 --> 01:04:10,560 Speaker 1: interested in things that I never looked twice at before. 1047 01:04:11,120 --> 01:04:14,160 Speaker 1: One subject that I'd always found fascinating was the correlation 1048 01:04:14,240 --> 01:04:17,840 Speaker 1: between the Native American Choctaw tribe and the people of Ireland. 1049 01:04:18,000 --> 01:04:20,360 Speaker 1: I didn't realize that was a thing at all. 1050 01:04:20,800 --> 01:04:23,040 Speaker 1: This is a story, okay, which 1051 01:04:23,120 --> 01:04:25,680 Speaker 1: isn't well known outside of some areas of Ireland and, 1052 01:04:25,800 --> 01:04:27,720 Speaker 1: of course, within the tribe, but it's a really good 1053 01:04:27,800 --> 01:04:30,640 Speaker 1: story of solidarity between two groups of people despite being 1054 01:04:30,720 --> 01:04:33,560 Speaker 1: thousands of miles apart. Less than twenty years after the 1055 01:04:33,600 --> 01:04:37,120 Speaker 1: Trail of Tears, which forcibly displaced thousands of natives, the 1056 01:04:37,240 --> 01:04:39,960 Speaker 1: Great Famine hit Ireland. During this time, as you know, 1057 01:04:40,120 --> 01:04:42,560 Speaker 1: Ireland was colonized by the British, and the people of 1058 01:04:42,640 --> 01:04:45,640 Speaker 1: Ireland were treated poorly due to the common misconception that 1059 01:04:45,720 --> 01:04:49,560 Speaker 1: Irish Catholics were a lower caliber of human. He goes 1060 01:04:49,640 --> 01:04:51,880 Speaker 1: on to give more examples, but suffice it to 1061 01:04:51,960 --> 01:04:54,880 Speaker 1: say it was not good for the Irish people. During 1062 01:04:54,920 --> 01:04:59,040 Speaker 1: the famine, word spread to America and to the Choctaw tribe. 1063 01:04:59,400 --> 01:05:01,960 Speaker 1: They sympathized with the Irish people so much that, 1064 01:05:02,120 --> 01:05:05,240 Speaker 1: only fifteen years after the Trail of Tears, they donated 1065 01:05:05,320 --> 01:05:08,960 Speaker 1: seven hundred and ten dollars during eighteen forty five to 1066 01:05:09,120 --> 01:05:12,200 Speaker 1: send to Ireland as part of a relief fund. This 1067 01:05:12,400 --> 01:05:15,040 Speaker 1: is estimated to be roughly sixty eight thousand dollars in 1068 01:05:15,080 --> 01:05:18,439 Speaker 1: today's money. This was greatly appreciated by the Irish people, 1069 01:05:18,480 --> 01:05:21,360 Speaker 1: and after the famine the bond continued.
In Cork, we 1070 01:05:21,440 --> 01:05:24,560 Speaker 1: have a sculpture honoring the tribute of the Choctaw people, 1071 01:05:24,920 --> 01:05:27,200 Speaker 1: and in nineteen ninety, members of the tribe came to 1072 01:05:27,280 --> 01:05:30,560 Speaker 1: Ireland and walked the Famine Walk in Mayo to replicate 1073 01:05:30,640 --> 01:05:33,160 Speaker 1: the walk that starving people made to ask the landlord 1074 01:05:33,280 --> 01:05:37,240 Speaker 1: for help. Later, an Irish commemoration group walked from 1075 01:05:37,280 --> 01:05:40,840 Speaker 1: Oklahoma to Mississippi to replicate the Trail of Tears and 1076 01:05:41,120 --> 01:05:45,120 Speaker 1: raised seven hundred thousand dollars to help fight poverty in Africa. 1077 01:05:45,680 --> 01:05:47,720 Speaker 1: These two groups continue to work together, and to this 1078 01:05:47,880 --> 01:05:50,400 Speaker 1: day our President has been declared an honorary member of the 1079 01:05:50,480 --> 01:05:54,000 Speaker 1: Choctaw tribe. Along with the Quakers, who fed Irish people 1080 01:05:54,040 --> 01:05:56,840 Speaker 1: to the point that their members ended up starving themselves, the 1081 01:05:56,960 --> 01:06:00,080 Speaker 1: Choctaw tribe remain some of the unsung heroes of the 1082 01:06:00,120 --> 01:06:02,720 Speaker 1: famine story of Ireland. Sorry it went on so long. 1083 01:06:02,800 --> 01:06:05,240 Speaker 1: I'm an Irish historian, so I tend to waffle. Love 1084 01:06:05,320 --> 01:06:09,160 Speaker 1: the show. Best of luck with yourselves. Róisín Kilroy. 1085 01:06:09,280 --> 01:06:14,680 Speaker 1: Fantastic, great story. Thanks a lot, Róisín. I'm 1086 01:06:14,800 --> 01:06:17,200 Speaker 1: quite sure that's not the actual pronunciation of your name, 1087 01:06:17,240 --> 01:06:19,760 Speaker 1: because there's a lot of accent marks over letters where there 1088 01:06:19,880 --> 01:06:23,560 Speaker 1: normally aren't. So I apologize for that, but I 1089 01:06:24,080 --> 01:06:26,600 Speaker 1: nailed your last name. I'm positive of it. And Josh 1090 01:06:26,680 --> 01:06:29,200 Speaker 1: Clark, three and a half stars. Not bad. Out of 1091 01:06:29,240 --> 01:06:32,160 Speaker 1: three and a half? Right. Remember, wasn't Star Search 1092 01:06:32,440 --> 01:06:35,560 Speaker 1: out of four stars? Oh, I don't remember. I just 1093 01:06:35,800 --> 01:06:40,200 Speaker 1: now remembered there was such a thing as Star Search. Well, okay, 1094 01:06:40,240 --> 01:06:42,400 Speaker 1: well, you take the end part, Chuck, since I took 1095 01:06:42,440 --> 01:06:47,800 Speaker 1: listener mail. Oh, geez. Uh, thanks for listening. Hey, 1096 01:06:47,880 --> 01:06:49,040 Speaker 1: if you want to get in touch with us, you 1097 01:06:49,120 --> 01:06:53,840 Speaker 1: can find Josh at josh um Clark on Twitter, and 1098 01:06:54,120 --> 01:06:57,800 Speaker 1: me on Facebook at Charles W. Chuck Bryant, or 1099 01:06:57,880 --> 01:07:01,240 Speaker 1: you can go to our official page, Stuff You Should Know Podcasts. 1100 01:07:01,440 --> 01:07:04,120 Speaker 1: What else? Let's see, if they want to send 1101 01:07:04,160 --> 01:07:06,960 Speaker 1: us an email... Oh yeah, email us at stuff podcast 1102 01:07:07,040 --> 01:07:10,640 Speaker 1: at how stuff works dot com, and have a good day. 1103 01:07:10,920 --> 01:07:13,440 Speaker 1: Is that what you say? That's good enough. All right.
1104 01:07:18,800 --> 01:07:21,240 Speaker 1: For more on this and thousands of other topics, visit 1105 01:07:21,280 --> 01:07:22,480 Speaker 1: how stuff works dot com.