Welcome to Stuff to Blow Your Mind from HowStuffWorks.com.

Hey, welcome to Stuff to Blow Your Mind. My name is Robert Lamb. And I'm Joe McCormick. And Robert, I want to take you back to a conversation we had, I think it was last December. It was right after I went to see the most recent Star Wars movie, Rogue One. Oh yes. And I am on the cusp, the very cusp, of seeing it myself, waiting for it to become a rental option. Oh, it's not out yet? No, still a week or two out. So has everything been spoiled for you so far? No. People have been a little cooler on this one. I think some things are probably spoiled for me, but not like that last one, where everybody just really felt the need to, you know, lay it all out on social media. Let me spoil one thing for you: they go to space, and there's a war there. Yeah, wars among the stars. But there is one thing in the movie. Okay, so mild spoiler for Rogue One coming up. It's something that probably everybody already knows.
It's also not really about the content of the movie, but just about which characters you see. But are you ready, are you ready for the mild spoiler? Okay. If you remember back to the original Star Wars, back in the seventies: Peter Cushing as Grand Moff Tarkin, the guy who was in fact Darth Vader's boss on the Death Star. Him? Yeah. "Enough of this. Vader, release him." And we love Peter Cushing because he was in all these old monster movies. He goes back to the Hammer movies. He was Dr. Frankenstein, the villainous Dr. Frankenstein of the Hammer films, and I think he was the hero of the version of The Mummy that has Christopher Lee as the Mummy. I've got the poster for it in the house, except it's the Belgian poster, so it's La Malédiction des Pharaons. But yeah, so Peter Cushing was the original Grand Moff Tarkin, this bad Empire guy who was Darth Vader's boss. And the thing they do in the new Rogue One is they bring him back. He's dead; he has passed away.
This movie takes place a little bit before the original Star Wars is supposed to have taken place, but they bring this character back, and they have an actor stand in as him. But it's not just a recast role. They try to make it look as if this is Peter Cushing standing there delivering the lines, with CGI. And this is an odd choice, because, all right, you're going to have Darth Vader in there; that's easy to do. Darth Vader is a dude in a suit, voiced by James Earl Jones, and James Earl Jones is still alive, so you can check that one off the list. But Grand Moff Tarkin, like you said, the actor is dead. So it seems to me like the first, easiest thing to do is just don't have those scenes. If you know it's going to be problematic, don't even mess with it. Or just use an actual living actor, such as Wayne Pygram, who played him in Revenge of the Sith.
Or go with Ben Cross, who's another actor I've seen brought up over the years as potential Tarkin casting. Or heck, go with Ralph Fiennes. Clearly you have the money to throw down the well of expensive CGI equipment; just go ahead and hire Ralph Fiennes. I know he's pricey, but he's great, and the consummate evil Brit. Yeah, he even kind of looks like a younger Peter Cushing. He's got that same kind of angular face, the thin, long face with the jaw and the scowl. It's all there. And it's not like fans of various franchises are not clearly cool with recasting. It's not like we're going to be thrown into, you know, a traumatic spin, because you can look to Game of Thrones, James Bond, Twilight, Harry Potter, et cetera. We get it; we can roll with a recast.

Now, I want to go in two completely different directions thinking about this CGI Grand Moff Tarkin. One is that I didn't like it in the movie. Okay. I saw it and I was just like, I don't want this. It pulled me out of the movie.
It made me stop being in the story and just start thinking about how they did that. On one hand, it looked great. When you see the movie, I think you will kind of have to agree. Unless I'm missing something, it's the best CGI simulation of a real person that I've ever seen. It looks amazing, but it still looks not quite good enough that I can just accept it and go with it. I kept continually thinking, what am I looking at? It's almost really him, but it's not quite really him, and it made me feel icky. So it made you descend into what we've come to know, as a species, as the Uncanny Valley.

Right. So today is going to be the first of two episodes we want to do about the Uncanny Valley. And in this first one we wanted to descend into the Uncanny Valley, but not just talk about it in terms of the standard pop culture phenomenon, because this is one of those sci-tech concepts that has totally filtered down into the mainstream. Everybody talks about the Uncanny Valley.
It's a totally normal, ground-level pop culture phenomenon now, especially with as much bad CGI as we encounter in the movies. But it's also a scientific field of study. It's something that people are looking into with empirical research, to try to figure out: does it really exist? If it does really exist, what causes it? What can be done about it? So we want to look at it from both of these angles today.

Right. So we should probably roll through some just fun examples of this. We're going to try not to go too long on this; if we do, we'll cut it and save it for Trailer Talk. Either way, we'll probably do a Facebook Live trailer talk on an upcoming Friday about some of these movies. Okay. So I want to go back to a much earlier experience for me, Robert. Did you see The Mummy Returns in two thousand one? Remember this one? I don't think I saw The Mummy Returns. I saw The Mummy, and I remember digging it at the time. But not the old Hammer one or the Universal one? No, no, the reboot of The Mummy.
Is his name Brendan Fraser? Yeah. And what, Arnold Vosloo? Yeah, I enjoyed him, as they kind of brought in some of these aspects of the tragic Mummy figure, which I liked. Yeah, yeah, okay. But in two thousand one we got the Scorpion King. This is a character that appears in The Mummy Returns. If you want to picture it, if you haven't seen the movie... actually, you should look up video of this. We're going to tell you to go look up images and video quite a few times in these episodes, because some visual aids will help. But if you want to picture it, picture the concept of a centaur, except replace the horse parts with scorpion parts and some other random arthropod bits, and the man part on top is Dwayne "The Rock" Johnson. Except it's not Dwayne "The Rock" Johnson. There's a bit of a problem with The Rock. So the Scorpion King scuttles into action in the film, and you can tell immediately something is wrong, because it's not just The Rock.
It's this CGI upper body designed to look like The Rock. It's supposed to be him, but it doesn't look right. It looks like somebody took The Rock, skinned him, and then took the skin suit and boiled it, and then maybe ironed it and rubbed it down with wax, and then stretched it over somebody else, like a bald Crispin Glover wearing a waxed-up Rock suit. And that would be fine if that's what they were going for. But I guess the disconnect here is that clearly they wanted this to be The Rock as a scorpion centaur, and not this creepy, inhuman abomination that you've described. Right. And it sort of, I guess, works, because it's okay if he's creepy; he's a monster. But he was creepy in a way that he clearly wasn't supposed to be. It wasn't just that, oh, he's a monster, he looks weird. Something looked wrong with him. And this was at a time when computer-generated animations were hot, right? Two thousand one.
They seemed to be getting better all the time, and yet they were terrible, producing these characters that were not only not convincingly human, they were literally physically unpleasant to look at. They were repulsive. Yeah, it was a period when everyone was just foolishly optimistic about what we could achieve with CGI, and, you know, in a sense, maybe that hasn't gone away. Take the Rogue One example: clearly everyone was very optimistic about how great this looked, and even though, to your point, it does look great, within the context of the film something doesn't quite work. Yeah, I would say. Now, for some people... and we can get into this more, especially in the next episode, when we want to try to go beyond the valley, the Uncanny Valley. But I will say at this point, for some people, Tarkin was not over the line, or under the line. I don't know where you put the line.
But for some people it worked, and I do think that's an interesting thing to acknowledge: that while I experienced this, not to the same extent as with the Scorpion King, but a kind of Scorpion King revulsion, not everybody did. Now, one thing about Tarkin is that the CGI Tarkin character, and correct me if I'm wrong, because I have not seen it myself yet, is interacting with human actors in his scenes? Yeah, I think so, or at least he's in a film with other human actors, even if he's not sharing the exact same scene with them. So you might think, well, if you just had a movie full of brilliant-looking Tarkins, maybe it would be okay. And maybe it would. But some of the classic examples of the Uncanny Valley happen to be films that are filled with nothing but CGI characters. Yeah, how about one from the same year as The Mummy Returns, two thousand one? If you go back to... Oh yes, Final Fantasy: The Spirits Within. I remember kind of liking it. I do too.
It was a film that I think I kind of half watched, half worked on, with some college coursework or something just on in the background. And maybe that was the right level of immersion in it. But I remember digging it. At the same time, there are a lot of dead puppet eyes in this movie. Oh yeah. So I saw it at the time, and I remember having mixed feelings about the animation. In some senses, I remember thinking, wow, that looks so cool. That again may have been a product of its time; we can talk about that more, how our expectations change as things go on. But also, I don't know, there were multiple things wrong with that movie, one of which being that the last line of spoken dialogue in the movie, as a friend of mine pointed out at the time, was "Oh, it's warm." Well, I don't remember the line, so I can't speak to how well it landed, but I can see that being a problem going into the credits. Now, another big one, and this came out just three years later, is of course The Polar Express.
Now, when people talk about the Uncanny Valley these days, I'd say this is a top-three mention. Yeah, this is one of the defining nightmares of our time. Now, it's based on a wonderful children's book about the magic of Christmastime. Yeah, the book is wonderful, but it's certainly one of these examples where, if you take a very brief children's book and you try to adapt it into a feature-length motion picture, that's very difficult to do. In fact, I'm really grasping for an example where anybody actually pulled it off. The best adaptations of children's books that come to mind are all very short films, generally. I'm thinking of the Dr. Seuss adaptations from the seventies and eighties. Not The Polar Express, which is just an exercise in psychic trauma brought on by seemingly intentionally weaponized Uncanny Valley. You know, the soulless puppet people. I've never seen this movie, but I looked up clips to see what people were talking about, and oh man, they are not kidding. I don't know how children made it through this movie.
It has these creepy elves. It's got a creepy Tom Hanks as a train conductor. Nothing seems right. Everything seems like everybody's about to start melting and screaming. Yeah. I think this is one where it was a poor idea, in my opinion, and the technology was not there to rescue the idea.

Now, the next one we're going to discuss, though, I think was a great idea on paper, but it just didn't work out on the screen. And that's of course two thousand seven's Beowulf. Now, was this Robert Zemeckis who did this? Yeah, Robert Zemeckis helmed it, and the writers were Neil Gaiman and Roger Avary. So some big names attached to the ideas behind this movie. And of course it's based on the story of Grendel and Beowulf, which is a classic, you would think, hard-to-miss action narrative. I think Beowulf could make a really great movie if somebody did it right. Yeah, I think so too. I have yet to see that movie.
But it certainly has all the potential in the world. And they had a pretty cool vocal cast as well, I think. Angelina Jolie is in it as the monster's mother, and they have Ray Winstone as Beowulf. Yeah, he does the voice of Beowulf. And who was it that plays the monster? That was Crispin Glover. Now, I've seen it, and that's not one of my favorite monster depictions of Grendel, by the way. But he's a monster; we can get past that. But everybody else in the film really has that uncanny valley thing going on, to a high degree. I think I read a quote somewhere where a film critic was talking about how the monsters in the movie were only slightly less frightening than the humans. Yeah, the humans. It just didn't land. Now, at this point you're probably thinking, well, how about video games? Because certainly when you're thinking about computer-animated human beings interacting with each other, staring right into the camera, you think of video games. Yeah. And here's the thing.
I have to say that I haven't noticed it as often these days. I think a lot of game animators have found ways to get around the Uncanny Valley. Yeah. I don't want to get too ahead of our flow here, but one thing that I've noticed they sometimes do is that they don't actually go for photorealism. They go for a kind of more-real-than-real combination of, like, a comic-book-style character illustration and then these other realistic aspects, so that when you look at a video game character, you would never mistake it for a photograph of a person, even one that's got really good graphics. But it's much like the way dialogue is written in films. You know, you don't want to make dialogue sound like real people talk, because that would be horrible to listen to, but you do want to make it sound quote-unquote realistic. You don't want to make your characters look too realistic in animation, but you do want to make them look quote-unquote realistic. In other words, they feel real.
Yeah, this reminds me of a game franchise that... I don't think I've ever played more than a demo of it, but the Gears of War series. All the people in this kind of look like, if you're going to be critical, you might say everyone looks kind of like weird gorilla people. Like we're in an alternate world with unrealistically huge upper bodies. Yeah, as if evolution took a slightly different turn into intelligent primates. And yet they look real. They don't give off an uncanny effect. You look at them, you can see pores, you can see hair follicles. They look real, but they are certainly not going for an authentic human being there.

All right. Now, I want to put out one more example here before we move on, and it's a rare example of uncanny valley avoidance, a very specific type of uncanny valley avoidance, and it's from a fantastic stop-motion short that was produced by the National Film Board of Canada. And you can find this online if you just do a search for it.
It's Madame Tutli-Putli, and it's a wonderful little film, with a very French feel to it. Characters on a train, weird, frightening things occurring. Definitely check it out. But here's the trick to it: these are stop-motion animated characters, and their eyes just feel so alive. They stare right into you, and you don't question for a second that these are people. And the trick that they employed is that they used real human eyes. Not in a, you know, depraved, evil-puppet-master kind of way, either. They videotaped the eyes of human actors and then blended the footage with that of the puppets. That sounds like an incredible gambit, because it could have produced some of the worst Uncanny Valley feelings ever if it went wrong. Yeah, and I don't know, there may be some people who watch this short and have the opposite reaction and think that it's super creepy. I found it to be this interesting example of circumventing the Uncanny Valley. But I'll leave it for you guys to decide.
I'll include a link to this one, as well as some of the other sources we're talking about, on the landing page for this episode at StuffToBlowYourMind.com. All right, well, we are going to take a quick break, and when we come back we'll get into the origin of the scientific idea of the Uncanny Valley and its history and research.

All right, we're back. So, the Uncanny Valley. Where does this even come from? Right. So we've already been talking about it, because most people have heard of this; they're somewhat familiar with it. I was talking to Rachel about it, though, and she was saying, you know, at least to her, it had this connotation of just generally synthetically generated images being creepy in one way or another. So maybe we should get into the specifics of the origin of the idea. Let's go back to the year nineteen seventy. Everything's great. Wait, is it? I don't know. But everybody's looking forward to the future in terms of creating humanoid robots. What are we going to be able to do? Well...
The Japanese roboticist Masahiro 343 00:17:52,960 --> 00:17:57,560 Speaker 1: Mori of the Tokyo Institute of Technology. He wrote a 344 00:17:57,600 --> 00:18:00,760 Speaker 1: paper that was published in this Japanese journal Energy that 345 00:18:00,920 --> 00:18:05,840 Speaker 1: coined the term uncanny valley to describe a problem that 346 00:18:05,880 --> 00:18:09,359 Speaker 1: he was predicting with increasingly humanoid robots. And this was 347 00:18:09,400 --> 00:18:15,240 Speaker 1: based on just some observations he'd had of of different events. 348 00:18:15,280 --> 00:18:18,000 Speaker 1: So you might say incidents in the progress of designing 349 00:18:18,040 --> 00:18:21,520 Speaker 1: humanoid robots, such as consumer electronics shows in Japan and 350 00:18:21,560 --> 00:18:25,120 Speaker 1: stuff like that. So what he predicted was that as 351 00:18:25,520 --> 00:18:27,920 Speaker 1: you had a humanoid robot, a robot that looks like a 352 00:18:28,000 --> 00:18:32,680 Speaker 1: human, and its likeness to a human increased, our attitude 353 00:18:32,960 --> 00:18:36,560 Speaker 1: towards them would improve. Our affinity would go up as 354 00:18:36,600 --> 00:18:40,920 Speaker 1: they became more human, until they reached a certain tipping 355 00:18:41,000 --> 00:18:44,879 Speaker 1: point of similarity to humans, where suddenly our affinity, our 356 00:18:44,960 --> 00:18:50,600 Speaker 1: friendly attitude, almost immediately shifts and plunges down into strong revulsion. 357 00:18:51,880 --> 00:18:55,600 Speaker 1: Being human is likable, being sort of human is likable, 358 00:18:55,720 --> 00:19:00,320 Speaker 1: but being almost human is horrible and repulsive. And then 359 00:19:00,320 --> 00:19:03,119 Speaker 1: of course at the final end, uh you you would 360 00:19:03,200 --> 00:19:05,040 Speaker 1: have a real human.
So you can think of the 361 00:19:05,119 --> 00:19:08,000 Speaker 1: uncanny valley as a place in a graph, an X 362 00:19:08,160 --> 00:19:11,480 Speaker 1: Y graph, and along the horizontal axis on the bottom, 363 00:19:11,640 --> 00:19:14,439 Speaker 1: you've got the degree of similarity to a human, and 364 00:19:14,480 --> 00:19:17,000 Speaker 1: then on the vertical axis you've got the degree of 365 00:19:17,040 --> 00:19:20,800 Speaker 1: our affinity for the object. And Mori hypothesized this graph 366 00:19:20,800 --> 00:19:23,240 Speaker 1: would have these two peaks. You'd start with zero on 367 00:19:23,320 --> 00:19:26,040 Speaker 1: both axes, because a thing that has no human like 368 00:19:26,200 --> 00:19:29,920 Speaker 1: traits basically gets no human affinity response one way or another. 369 00:19:29,960 --> 00:19:32,119 Speaker 1: And we just don't you know, how much do you 370 00:19:32,200 --> 00:19:35,720 Speaker 1: really like an industrial conveyor belt. You're just sort of 371 00:19:35,760 --> 00:19:39,960 Speaker 1: neutral on it. But as you increase the humanity, you 372 00:19:40,080 --> 00:19:44,720 Speaker 1: give a robot arms or something that looks like a face, eyes, limbs, 373 00:19:45,119 --> 00:19:48,639 Speaker 1: you climb this gentle, gradual slope to the first peak 374 00:19:48,680 --> 00:19:53,240 Speaker 1: in affinity. Um, you know, and he didn't name the peak, 375 00:19:53,240 --> 00:19:55,000 Speaker 1: but I think we should name the peak. I think 376 00:19:55,000 --> 00:19:58,000 Speaker 1: this first peak should be called something like the cuteness peak. 377 00:19:58,119 --> 00:20:01,399 Speaker 1: That's not exactly right because it's not exactly cuteness, but 378 00:20:01,480 --> 00:20:05,920 Speaker 1: it's recognizing something kind of human about what you're looking at.
Yeah, Like, 379 00:20:06,680 --> 00:20:08,880 Speaker 1: I mean, we don't have to describe cute to everyone here, 380 00:20:08,920 --> 00:20:11,960 Speaker 1: but certainly this is hello kitty territory, this is the 381 00:20:12,240 --> 00:20:16,120 Speaker 1: this is the domain of large eyed, vaguely infant 382 00:20:16,200 --> 00:20:19,560 Speaker 1: or kitten like creatures that would never be mistaken for 383 00:20:19,640 --> 00:20:22,200 Speaker 1: human or real, but they resonate with us for a 384 00:20:22,280 --> 00:20:24,359 Speaker 1: number of reasons. We could do a whole podcast; in fact, 385 00:20:24,400 --> 00:20:27,320 Speaker 1: we have an old podcast episode about the science of cute. 386 00:20:27,359 --> 00:20:31,160 Speaker 1: Why that connects with us? Yeah, so they would include 387 00:20:31,200 --> 00:20:33,720 Speaker 1: that would include all kinds of robots that just kind 388 00:20:33,720 --> 00:20:37,520 Speaker 1: of have general, very basic faces that don't try to 389 00:20:37,560 --> 00:20:40,520 Speaker 1: have human skin or anything like that. That just might 390 00:20:40,600 --> 00:20:45,440 Speaker 1: have like kind of a mouth and some cartoonish eyes. Yeah, sure, 391 00:20:45,520 --> 00:20:48,159 Speaker 1: there you go. The C-3PO sits solidly on 392 00:20:48,200 --> 00:20:52,520 Speaker 1: the cuteness peak. But at a certain point after this first peak, 393 00:20:52,600 --> 00:20:55,480 Speaker 1: this graph drops off steeply. So you keep going along 394 00:20:55,480 --> 00:20:58,680 Speaker 1: the x axis, but then the y axis drops off, 395 00:20:58,760 --> 00:21:01,880 Speaker 1: not just down to zero, but far below zero, into 396 00:21:01,880 --> 00:21:05,199 Speaker 1: the negative affinity range. And this part of the graph 397 00:21:05,280 --> 00:21:08,760 Speaker 1: is the uncanny valley. As the similarity to a real 398 00:21:08,880 --> 00:21:12,520 Speaker 1: human continues to increase, nearing identity.
In other words, 399 00:21:12,720 --> 00:21:16,600 Speaker 1: as it becomes indistinguishable from a real human, our affinity 400 00:21:16,680 --> 00:21:20,280 Speaker 1: sharply shoots back up the second peak toward reality. So 401 00:21:20,320 --> 00:21:23,320 Speaker 1: I'd call this second peak the reality peak. It's when 402 00:21:23,359 --> 00:21:28,480 Speaker 1: you become, for all intents and purposes, a real human being. Yeah. 403 00:21:28,800 --> 00:21:31,640 Speaker 1: I would also say that if if robots were candy, 404 00:21:31,920 --> 00:21:35,240 Speaker 1: the bottom of the uncanny valley would be banana flavored 405 00:21:35,280 --> 00:21:38,320 Speaker 1: candy. Like, that for me has always been a flavor 406 00:21:38,400 --> 00:21:41,040 Speaker 1: where it's like clearly off, like the Runts that 407 00:21:41,119 --> 00:21:44,360 Speaker 1: have the bananas, I think. So take grape candy: 408 00:21:44,560 --> 00:21:47,600 Speaker 1: grape candy doesn't really taste like grapes, but it's 409 00:21:47,800 --> 00:21:50,119 Speaker 1: it's far enough from the Uncanny Valley of candy that 410 00:21:50,160 --> 00:21:53,760 Speaker 1: you're you're okay, whoa, You're right, banana candy actually does 411 00:21:53,880 --> 00:21:57,080 Speaker 1: taste like bananas in a way that makes it not 412 00:21:57,320 --> 00:22:01,720 Speaker 1: really good. Yeah. Look, candy fans, I don't eat 413 00:22:01,720 --> 00:22:04,639 Speaker 1: that much candy anymore. So maybe the technology has advanced, 414 00:22:04,640 --> 00:22:08,399 Speaker 1: but uh, my memory of the banana candy is is 415 00:22:08,840 --> 00:22:12,359 Speaker 1: that of an uncanny experience. Now, one thing we should 416 00:22:12,359 --> 00:22:14,679 Speaker 1: note is that so this original paper was published in 417 00:22:14,800 --> 00:22:19,439 Speaker 1: nineteen seventy. A twenty twelve
English translation was published in uh the 418 00:22:19,520 --> 00:22:22,280 Speaker 1: I Triple E Robotics and Automation magazine. And that's what 419 00:22:22,320 --> 00:22:24,760 Speaker 1: I was using as my reference, that English translation from 420 00:22:26,200 --> 00:22:29,120 Speaker 1: uh and so it has some graphs here. It has 421 00:22:29,160 --> 00:22:32,679 Speaker 1: Mori's original graphs or interpretations of them, and we can 422 00:22:32,720 --> 00:22:35,280 Speaker 1: get into a little more detail on the nuances of 423 00:22:35,320 --> 00:22:37,640 Speaker 1: Mori's theory. But one thing I did read was that 424 00:22:38,480 --> 00:22:43,000 Speaker 1: many years later, somebody contacted Mori and they 425 00:22:43,040 --> 00:22:45,240 Speaker 1: were talking to him about this idea he had of 426 00:22:45,280 --> 00:22:47,719 Speaker 1: the uncanny valley, and they were like, well, does does 427 00:22:47,760 --> 00:22:51,040 Speaker 1: anything lie beyond the peak of reality? And he said, hey, 428 00:22:51,040 --> 00:22:53,919 Speaker 1: oh yeah, actually there is such a thing. And he 429 00:22:54,000 --> 00:22:56,400 Speaker 1: said you know, beyond the real human, you'd have sort 430 00:22:56,440 --> 00:23:03,000 Speaker 1: of like artistic ideals, like the realm of forms even right. Yeah, 431 00:23:03,040 --> 00:23:05,199 Speaker 1: So well, I think he used an example of like 432 00:23:05,240 --> 00:23:08,400 Speaker 1: a statue of Buddha, you know, a beautiful, perfect statue 433 00:23:08,400 --> 00:23:11,359 Speaker 1: of Buddha. It's almost like it we have greater affinity 434 00:23:11,400 --> 00:23:15,280 Speaker 1: for it than we have for a realistic human because 435 00:23:15,280 --> 00:23:18,440 Speaker 1: we've been, well, to a large point, we've been conditioned 436 00:23:18,480 --> 00:23:20,960 Speaker 1: to right.
Yeah, that that kind of gets into this, 437 00:23:20,960 --> 00:23:24,159 Speaker 1: this idea of conditioned familiarity that we not only have 438 00:23:24,240 --> 00:23:27,960 Speaker 1: with religious icons, but also with pop culture icons, 439 00:23:28,320 --> 00:23:31,200 Speaker 1: so not only the Buddha, but also Robby the Robot, 440 00:23:31,800 --> 00:23:36,040 Speaker 1: or or even the Terminator or well, yeah, that does 441 00:23:36,160 --> 00:23:38,960 Speaker 1: make me think that in some ways, if if aesthetic 442 00:23:39,000 --> 00:23:41,800 Speaker 1: ideals and things that we're familiar with through our culture 443 00:23:42,240 --> 00:23:45,080 Speaker 1: might be even beyond humans. I mean, again, this is 444 00:23:45,119 --> 00:23:47,840 Speaker 1: not like rigorous research, This is just what Mori says 445 00:23:47,880 --> 00:23:51,119 Speaker 1: he thinks, what he predicts. Uh, could could there be like 446 00:23:51,160 --> 00:23:54,480 Speaker 1: a robot that we really love that's actually better than 447 00:23:54,520 --> 00:23:56,960 Speaker 1: a than a normal human? Well, you know, there's a 448 00:23:57,040 --> 00:23:59,920 Speaker 1: study that came out last year, I believe from Penn 449 00:24:00,000 --> 00:24:02,520 Speaker 1: State University that was kind of interesting along these lines.
450 00:24:02,840 --> 00:24:06,679 Speaker 1: So the researchers surveyed three hundred seventy nine adults, ages 451 00:24:06,760 --> 00:24:09,680 Speaker 1: sixty to eighty six, and they asked them for specific 452 00:24:09,720 --> 00:24:13,240 Speaker 1: memories of robot films they'd seen and their general attitudes 453 00:24:13,320 --> 00:24:16,480 Speaker 1: towards robots and and the you know, with the age here, 454 00:24:16,560 --> 00:24:20,720 Speaker 1: as you might imagine, they're really looking at at potential 455 00:24:20,760 --> 00:24:22,600 Speaker 1: care robots, like the idea of like what kind of 456 00:24:22,680 --> 00:24:24,680 Speaker 1: robots should help you use the bathroom? Do you want 457 00:24:24,720 --> 00:24:26,960 Speaker 1: something that looks like kind of like a person, or 458 00:24:27,000 --> 00:24:29,840 Speaker 1: do you want something that looks like, say, 459 00:24:30,480 --> 00:24:32,840 Speaker 1: a forklift mated with an easy chair. But if you 460 00:24:32,880 --> 00:24:36,000 Speaker 1: look at the ages used here in twenty sixteen, when 461 00:24:36,040 --> 00:24:38,520 Speaker 1: the study took place, you can say that these people 462 00:24:38,640 --> 00:24:41,080 Speaker 1: grew up with science fiction. Oh yeah, I mean they 463 00:24:41,160 --> 00:24:44,159 Speaker 1: might not have personally consumed a lot of it, but 464 00:24:44,200 --> 00:24:46,720 Speaker 1: it's in the culture, right, Yeah. They they definitely had 465 00:24:46,760 --> 00:24:49,760 Speaker 1: access to it. And the researchers found that individuals who 466 00:24:49,760 --> 00:24:53,919 Speaker 1: could recall more cinematic robot portrayals were increasingly likely to 467 00:24:53,960 --> 00:24:56,560 Speaker 1: hold positive attitudes towards robots in general. So it didn't 468 00:24:56,600 --> 00:25:01,320 Speaker 1: matter if they remembered murderous killbots or well meaning helper bots.
Uh, 469 00:25:01,400 --> 00:25:05,880 Speaker 1: the mere memory of multiple robotic portrayals correlated 470 00:25:05,880 --> 00:25:09,920 Speaker 1: to pro robot vibes, per the study findings. They also 471 00:25:09,960 --> 00:25:14,520 Speaker 1: backed up the importance of human looking, humanesque robots 472 00:25:14,760 --> 00:25:18,159 Speaker 1: to invoke a sympathetic user response. But the researchers stress 473 00:25:18,200 --> 00:25:21,720 Speaker 1: that robot designers might want to incorporate robotic features that 474 00:25:21,800 --> 00:25:25,760 Speaker 1: older adults will remember from their cinematic past. So it's 475 00:25:25,800 --> 00:25:28,560 Speaker 1: saying that, like, don't just try to make it like 476 00:25:28,600 --> 00:25:30,800 Speaker 1: a human. Try to make it like the robots we 477 00:25:30,840 --> 00:25:33,639 Speaker 1: have known and loved. Yeah, like make it fun. You know, 478 00:25:34,680 --> 00:25:36,320 Speaker 1: if I'm if I need a robot to help me 479 00:25:36,359 --> 00:25:38,840 Speaker 1: go to the bathroom, make them 480 00:25:38,840 --> 00:25:41,119 Speaker 1: the robots from Silent Running, you know, Huey, Dewey and 481 00:25:41,119 --> 00:25:43,760 Speaker 1: Louie, the little guys. Then at least I can 482 00:25:43,800 --> 00:25:46,840 Speaker 1: engage my nostalgia a little bit. Totally. So I want 483 00:25:46,840 --> 00:25:48,959 Speaker 1: to look at a few more nuances from Mori's original 484 00:25:49,000 --> 00:25:51,440 Speaker 1: paper in nineteen seventy. So one thing I do think 485 00:25:51,440 --> 00:25:53,239 Speaker 1: is very interesting and I want to come back to 486 00:25:53,400 --> 00:25:57,359 Speaker 1: as we explore this topic more.
Mori hypothesizes in the 487 00:25:57,359 --> 00:26:00,600 Speaker 1: original paper that our perception of an uncanny valley 488 00:26:00,680 --> 00:26:04,119 Speaker 1: might depend on the context in which we're we're viewing 489 00:26:04,160 --> 00:26:06,800 Speaker 1: the being. And the example he gives here is he's 490 00:26:06,800 --> 00:26:10,720 Speaker 1: talking about Bunraku puppets, and so he says, quote, 491 00:26:10,760 --> 00:26:13,480 Speaker 1: I don't think that on close inspection a Bunraku 492 00:26:13,520 --> 00:26:17,600 Speaker 1: puppet appears similar to a human being. But when we 493 00:26:17,720 --> 00:26:20,480 Speaker 1: enjoy a puppet show in the theater, we're seated at 494 00:26:20,520 --> 00:26:24,080 Speaker 1: a certain distance from the stage, the puppet's absolute size 495 00:26:24,200 --> 00:26:28,280 Speaker 1: is ignored. Its total appearance, including hand and eye movements, 496 00:26:28,640 --> 00:26:31,040 Speaker 1: is close to that of a human being. So, given 497 00:26:31,080 --> 00:26:34,360 Speaker 1: our tendency as an audience to become absorbed in this 498 00:26:34,440 --> 00:26:37,240 Speaker 1: form of art, we might feel a high level of 499 00:26:37,280 --> 00:26:40,960 Speaker 1: affinity for the puppet. I think that's interesting. So it's 500 00:26:41,040 --> 00:26:44,119 Speaker 1: it's not just the object, but it's also the context 501 00:26:44,160 --> 00:26:46,840 Speaker 1: in which we experience the object. You might have very 502 00:26:46,840 --> 00:26:50,159 Speaker 1: different feelings about a Bunraku puppet lying on the 503 00:26:50,160 --> 00:26:54,160 Speaker 1: floor versus one that you go to see in the 504 00:26:54,200 --> 00:26:57,040 Speaker 1: context of staging a play. Yeah.
I think that the 505 00:26:57,080 --> 00:27:00,679 Speaker 1: puppet argument is something to keep in mind throughout considerations 506 00:27:00,680 --> 00:27:02,359 Speaker 1: of the Uncanny Valley because there are a lot of 507 00:27:02,359 --> 00:27:03,720 Speaker 1: people who have 508 00:27:03,800 --> 00:27:06,840 Speaker 1: kind of, um, an irrational aversion to puppets in general, 509 00:27:06,880 --> 00:27:09,960 Speaker 1: and certainly if you take just a still puppet and 510 00:27:09,960 --> 00:27:12,399 Speaker 1: you hold it up, there are various puppets that one 511 00:27:12,480 --> 00:27:15,600 Speaker 1: might find a little bit uncanny or creepy, etcetera. But 512 00:27:15,640 --> 00:27:19,159 Speaker 1: in the process of performing, a talented performer is 513 00:27:19,240 --> 00:27:21,720 Speaker 1: going to bring that to life. Like that's the art form. 514 00:27:22,040 --> 00:27:25,280 Speaker 1: And and there's so many different varieties of puppetry. Certainly 515 00:27:25,280 --> 00:27:30,560 Speaker 1: there are two broad categories: situations where the 516 00:27:30,600 --> 00:27:33,919 Speaker 1: puppeteer is visible and situations where the puppeteer is not. 517 00:27:34,040 --> 00:27:36,200 Speaker 1: You know, so you have your basic the Muppets situation 518 00:27:36,320 --> 00:27:38,520 Speaker 1: where you don't see the puppeteers, but there are plenty 519 00:27:38,560 --> 00:27:42,480 Speaker 1: of art forms of puppetry performance styles in which the 520 00:27:42,520 --> 00:27:46,280 Speaker 1: puppeteer is very visible, either completely or just their face.
521 00:27:46,440 --> 00:27:48,480 Speaker 1: You see their eyes, you see that there's a person 522 00:27:48,520 --> 00:27:51,920 Speaker 1: involved here, and uh, and there's not this this mystery 523 00:27:52,000 --> 00:27:55,399 Speaker 1: or this sense of deception, right, yeah, I think conceptual 524 00:27:55,760 --> 00:27:59,159 Speaker 1: clues like that are very important. Also, when you 525 00:27:59,200 --> 00:28:01,880 Speaker 1: consider the idea of going to a puppet theater, 526 00:28:02,359 --> 00:28:07,440 Speaker 1: it also includes a certain attitude charging effect in the audience, 527 00:28:07,880 --> 00:28:10,720 Speaker 1: Like an an audience member goes to a puppet theater 528 00:28:11,160 --> 00:28:14,840 Speaker 1: prepared to suspend their disbelief, like you know what I mean, 529 00:28:14,960 --> 00:28:17,920 Speaker 1: Like you put yourself in an intentional state of open 530 00:28:17,960 --> 00:28:21,760 Speaker 1: mindedness about what you're viewing, and you give yourself an 531 00:28:21,760 --> 00:28:24,959 Speaker 1: interpretive framework through which to view it. Like if you were not 532 00:28:25,119 --> 00:28:28,600 Speaker 1: prepared to watch a puppet theater story and suddenly a 533 00:28:28,640 --> 00:28:31,840 Speaker 1: puppet was just moving around, that might be a lot creepier. 534 00:28:33,240 --> 00:28:35,920 Speaker 1: So part of the Uncanny Valley effect is probably also 535 00:28:36,080 --> 00:28:39,080 Speaker 1: in the viewer themselves, and so the context 536 00:28:39,160 --> 00:28:41,560 Speaker 1: is not just where you are, what's going on with 537 00:28:41,560 --> 00:28:45,400 Speaker 1: what you're looking at, but what you're expecting to see. Now.
538 00:28:45,480 --> 00:28:48,880 Speaker 1: One more thing that Mori points out is he thinks 539 00:28:48,920 --> 00:28:51,480 Speaker 1: that there are going to be very different rules governing the 540 00:28:51,600 --> 00:28:57,000 Speaker 1: Uncanny Valley for still objects versus moving objects. And essentially 541 00:28:57,000 --> 00:29:00,960 Speaker 1: his hypothesis is that movement is going to amplify both 542 00:29:01,000 --> 00:29:04,239 Speaker 1: the peaks and the valleys of the graph. So if 543 00:29:04,240 --> 00:29:07,080 Speaker 1: you imagine the graph we described earlier, gentle slope up 544 00:29:07,080 --> 00:29:09,960 Speaker 1: to first peak, you know, kind of cute whatever has 545 00:29:10,040 --> 00:29:13,440 Speaker 1: some human characteristics, then a dip down into too close 546 00:29:13,480 --> 00:29:16,280 Speaker 1: to human but not there, and then a final rise 547 00:29:16,360 --> 00:29:20,080 Speaker 1: up to actually human. He he would say if it's moving, 548 00:29:20,200 --> 00:29:22,360 Speaker 1: the peaks are going to be higher and the valleys 549 00:29:22,400 --> 00:29:25,520 Speaker 1: are going to be lower. Okay, so a thing that 550 00:29:25,720 --> 00:29:28,800 Speaker 1: is moving gets greater affinity if it's good if it's 551 00:29:28,840 --> 00:29:31,560 Speaker 1: at one of these two peaks, but it's even more 552 00:29:31,600 --> 00:29:35,160 Speaker 1: revolting and unpleasant if it's at the valley. Uh So 553 00:29:35,320 --> 00:29:38,280 Speaker 1: this this makes me think of Samara in The Ring, 554 00:29:38,920 --> 00:29:41,320 Speaker 1: those scenes where Samara is emerging from the TV or 555 00:29:41,320 --> 00:29:45,240 Speaker 1: the well, her movement is is jerky and and I 556 00:29:45,320 --> 00:29:48,400 Speaker 1: understand that they created that effect by having the actor 557 00:29:48,600 --> 00:29:52,080 Speaker 1: or actress walk backwards and then reversing the footage.
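Mori's hypothesized curve, including the idea that movement amplifies both peaks and the valley, can be sketched as a toy model. The breakpoints, affinity values, and movement gain below are illustrative assumptions for the sketch only, not numbers from Mori's paper, which drew the curve qualitatively:

```python
# Toy sketch of Mori's hypothesized uncanny-valley curve.
# All breakpoints, affinity values, and the movement gain are
# assumed for illustration; Mori drew the curve qualitatively.

def affinity(likeness, moving=False):
    """Hypothetical affinity for an object with human likeness in [0, 1].

    Piecewise linear: a gentle climb to a first ("cuteness") peak,
    a plunge into a negative-affinity valley, then a steep rise to
    the "reality" peak. Per Mori's conjecture, movement amplifies
    peaks and valley alike (modeled here as a flat gain factor).
    """
    # (likeness, affinity) breakpoints -- assumed positions
    points = [(0.0, 0.0), (0.70, 0.6), (0.85, -0.4), (1.0, 1.0)]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if x0 <= likeness <= x1:
            # linear interpolation within the matching segment
            still = y0 + (y1 - y0) * (likeness - x0) / (x1 - x0)
            break
    else:
        raise ValueError("likeness must lie in [0, 1]")
    return 1.5 * still if moving else still  # assumed movement gain

# Almost-human scores worse than vaguely human, and motion
# deepens the valley further:
assert affinity(0.85) < 0 < affinity(0.70) < affinity(1.0)
assert affinity(0.85, moving=True) < affinity(0.85)
```

Plotting this function over likeness from zero to one, for still versus moving objects, reproduces the shape described here: movement pushes both peaks higher and the valley lower.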
So 558 00:29:52,160 --> 00:29:55,120 Speaker 1: you have this, you have this this movement that is 559 00:29:55,640 --> 00:29:58,600 Speaker 1: you know, natural, but being reversed it it feels very 560 00:29:58,680 --> 00:30:01,960 Speaker 1: unnatural and and it's hard to really pinpoint what's not 561 00:30:02,120 --> 00:30:05,800 Speaker 1: working for you about it. Right. So Mori in the 562 00:30:05,920 --> 00:30:09,520 Speaker 1: end concludes; he gives this recommendation based on his hypothesis. 563 00:30:09,520 --> 00:30:12,760 Speaker 1: He says, don't go for realism, right, It's gonna be 564 00:30:12,880 --> 00:30:16,239 Speaker 1: so hard if you're designing a humanoid robot. Now, a 565 00:30:16,240 --> 00:30:18,840 Speaker 1: lot of what we're we're talking about in these episodes 566 00:30:18,960 --> 00:30:22,520 Speaker 1: is animation. He's talking primarily about humanoid robots, but typically 567 00:30:22,560 --> 00:30:26,600 Speaker 1: these uh two fields get somewhat conflated in discussion of 568 00:30:26,640 --> 00:30:28,760 Speaker 1: the Uncanny Valley, because in both cases you're trying to 569 00:30:28,760 --> 00:30:32,720 Speaker 1: create something that looks pleasingly human. Um. He wrote, it's 570 00:30:32,720 --> 00:30:35,240 Speaker 1: gonna be so hard to get out of the valley 571 00:30:35,320 --> 00:30:38,280 Speaker 1: up the second peak, because that reality peak is 572 00:30:38,840 --> 00:30:42,800 Speaker 1: so steep. Instead, roboticists should not try, and instead they 573 00:30:42,800 --> 00:30:45,600 Speaker 1: should aim for the very tip of the first peak. 574 00:30:46,160 --> 00:30:48,440 Speaker 1: Stick on the cuteness peak because we know we can 575 00:30:48,440 --> 00:30:52,880 Speaker 1: get there. Think think WALL-E or other cute humanoid robots. 576 00:30:52,920 --> 00:30:55,520 Speaker 1: The first peak is not really that hard to attain.
577 00:30:55,920 --> 00:30:58,200 Speaker 1: People respond well to it, So why do you need 578 00:30:58,240 --> 00:31:01,240 Speaker 1: to try to go past it? Um? You know. As 579 00:31:01,280 --> 00:31:04,760 Speaker 1: for animated humans, I think a good analogy might be here. 580 00:31:04,840 --> 00:31:09,040 Speaker 1: Here's one: Pixar's The Incredibles versus Final Fantasy: The Spirits 581 00:31:09,080 --> 00:31:12,160 Speaker 1: Within, which we already mentioned. The former they don't look 582 00:31:12,200 --> 00:31:15,400 Speaker 1: like real humans at all, right, they're cute, cartoonish, non 583 00:31:15,520 --> 00:31:19,080 Speaker 1: realistic humans, but they're quite pleasant. The latter goes for 584 00:31:19,400 --> 00:31:23,840 Speaker 1: and fails at photo realism and creates these characters that 585 00:31:23,880 --> 00:31:26,880 Speaker 1: are stiff and unsettling. In other words, he says, don't 586 00:31:26,880 --> 00:31:28,880 Speaker 1: try to climb out of the valley. Just don't go 587 00:31:28,920 --> 00:31:32,040 Speaker 1: into the valley to begin with. Yeah, this this really 588 00:31:32,040 --> 00:31:34,520 Speaker 1: brings to mind just the idea of like filmmakers and 589 00:31:34,600 --> 00:31:38,320 Speaker 1: creators standing on the the the edge of this physical 590 00:31:38,440 --> 00:31:40,880 Speaker 1: valley and there's a local guide. They're saying, don't do it. 591 00:31:41,160 --> 00:31:43,520 Speaker 1: Don't do it, the valley will consume you. And they're like, no, 592 00:31:43,600 --> 00:31:45,880 Speaker 1: we're Lucasfilm. We can do it. Yeah, we got all 593 00:31:45,880 --> 00:31:49,120 Speaker 1: this high-tech gear. There's no way that anything's gonna 594 00:31:49,160 --> 00:31:50,840 Speaker 1: take us down there. And then they go down there 595 00:31:50,840 --> 00:31:54,040 Speaker 1: and it's just Jurassic Park or Congo, where they just get 596 00:31:54,080 --> 00:31:57,720 Speaker 1: torn apart. You know, I do.
I do think the 597 00:31:58,120 --> 00:32:01,360 Speaker 1: dinosaurs in Jurassic Park come out of the uncanny valley 598 00:32:01,360 --> 00:32:04,760 Speaker 1: for dinosaurs. They do. Yeah, And that introduces an interesting 599 00:32:04,760 --> 00:32:08,720 Speaker 1: wrinkle in that, like Mori is talking about humanlike qualities, 600 00:32:09,280 --> 00:32:13,080 Speaker 1: it would probably be a related but different thing to 601 00:32:13,200 --> 00:32:19,880 Speaker 1: just say animal reality versus specifically human reality. Yeah, because 602 00:32:20,000 --> 00:32:23,000 Speaker 1: I mean, for non human creatures, certainly we've been 603 00:32:23,000 --> 00:32:26,000 Speaker 1: able to nail that for ages with stop motion creatures; 604 00:32:26,040 --> 00:32:28,560 Speaker 1: often even if their movements are kind of herky jerky, 605 00:32:28,600 --> 00:32:33,080 Speaker 1: they feel great. Um, or the stop motion robots 606 00:32:33,520 --> 00:32:36,400 Speaker 1: in older films. I've never had a problem 607 00:32:36,440 --> 00:32:39,200 Speaker 1: buying into them, and yeah, you look into the eye 608 00:32:39,240 --> 00:32:41,840 Speaker 1: of the track or the T. rex or the velociraptor 609 00:32:42,000 --> 00:32:44,800 Speaker 1: and you never doubt for a second. Yeah. But so 610 00:32:44,920 --> 00:32:46,760 Speaker 1: one thing I think we should point out is that 611 00:32:46,960 --> 00:32:50,800 Speaker 1: as prescient as Mori was of what would become this 612 00:32:51,080 --> 00:32:55,120 Speaker 1: widely recognized pop culture phenomenon, his paper, it's not it's 613 00:32:55,120 --> 00:32:59,760 Speaker 1: not research, really, it's just sort of observation and interesting speculation.
614 00:33:00,400 --> 00:33:02,840 Speaker 1: So what we should shift now to do, I think, 615 00:33:02,960 --> 00:33:05,760 Speaker 1: is talk about whether there's really any evidence that the 616 00:33:05,840 --> 00:33:09,160 Speaker 1: Uncanny Valley, number one, exists at all. Is it really 617 00:33:09,160 --> 00:33:12,400 Speaker 1: a thing? Number two, is it a is it a 618 00:33:12,480 --> 00:33:15,440 Speaker 1: unified phenomenon, or are there some separate things 619 00:33:15,480 --> 00:33:18,680 Speaker 1: getting pulled into the net together here? And then finally 620 00:33:18,720 --> 00:33:21,560 Speaker 1: maybe we should look at if it's real, what causes it? 621 00:33:22,680 --> 00:33:25,920 Speaker 1: Why do human brains tend to react this way? So 622 00:33:25,960 --> 00:33:27,920 Speaker 1: maybe we should take a quick break, and then when 623 00:33:27,960 --> 00:33:35,800 Speaker 1: we come back we will get into more recent research. Okay, 624 00:33:35,840 --> 00:33:39,160 Speaker 1: So there's there's really no denying that there is some 625 00:33:39,280 --> 00:33:43,720 Speaker 1: kind of creepy humanoid synthetic figure effect. Right. We we've 626 00:33:43,760 --> 00:33:45,880 Speaker 1: all seen these CGI movies, We've all seen 627 00:33:45,880 --> 00:33:49,640 Speaker 1: these creepy robots and had that feeling of don't like it. 628 00:33:50,320 --> 00:33:55,000 Speaker 1: But that doesn't necessarily mean that the uncanny valley, as 629 00:33:55,320 --> 00:33:59,640 Speaker 1: described by Mori or as popularly conceived in culture, is 630 00:33:59,720 --> 00:34:03,600 Speaker 1: in fact a correct description of what's happening there. Right 631 00:34:03,680 --> 00:34:06,720 Speaker 1: just because it it feels truthy, just because it lines 632 00:34:06,840 --> 00:34:09,560 Speaker 1: up to a certain degree with how we feel 633 00:34:09,560 --> 00:34:12,239 Speaker 1: about the world, doesn't mean that it is.
You know, 634 00:34:12,280 --> 00:34:15,960 Speaker 1: that it is an actual effect that's taking place, 635 00:34:16,280 --> 00:34:18,640 Speaker 1: or that it's even a fixed effect, etcetera. There are 636 00:34:18,560 --> 00:34:21,560 Speaker 1: a lot of factors to contemplate here. Like, for my 637 00:34:21,600 --> 00:34:24,800 Speaker 1: own part, I've always found it interesting and I definitely 638 00:34:25,120 --> 00:34:29,120 Speaker 1: think there's something to it. However, you can line it up 639 00:34:29,160 --> 00:34:33,719 Speaker 1: with similar cases in life, such as, say, individuals that 640 00:34:33,760 --> 00:34:37,759 Speaker 1: you may encounter who have some degree of facial disfigurement, 641 00:34:37,800 --> 00:34:40,160 Speaker 1: and it might be extremely mild. It might be it 642 00:34:40,239 --> 00:34:42,960 Speaker 1: might be nothing more than, uh, you 643 00:34:43,000 --> 00:34:45,759 Speaker 1: know, a lazy eye, or, you know, some sort 644 00:34:45,760 --> 00:34:49,600 Speaker 1: of cleft lip or cleft palate type scenario, or 645 00:34:49,640 --> 00:34:52,080 Speaker 1: it just might be like their face is maybe not all 646 00:34:52,120 --> 00:34:56,680 Speaker 1: that symmetrical, and you know, nobody's face is perfectly symmetrical. 647 00:34:57,800 --> 00:35:01,399 Speaker 1: But with all of these individuals, you interact with them, 648 00:35:01,400 --> 00:35:04,960 Speaker 1: you get to know them maybe, and whatever kind of 649 00:35:05,000 --> 00:35:09,640 Speaker 1: like initial, um, reaction is present, be it just kind 650 00:35:09,640 --> 00:35:13,200 Speaker 1: of a huh, that goes away, and, 651 00:35:13,280 --> 00:35:15,640 Speaker 1: unless you're a total jerk, unless you're a total jerk, 652 00:35:15,800 --> 00:35:18,440 Speaker 1: you're gonna be able to relate to that person.
653 00:35:18,480 --> 00:35:20,600 Speaker 1: You're gonna be able to communicate with that person, and 654 00:35:20,640 --> 00:35:22,640 Speaker 1: you're not going to be thrown for a curve every 655 00:35:22,680 --> 00:35:25,080 Speaker 1: time they make eye contact with you. Yeah, I would 656 00:35:25,120 --> 00:35:27,920 Speaker 1: agree with that. So there is certainly, like in Mori's 657 00:35:27,920 --> 00:35:32,680 Speaker 1: original formulation, he he would, I think, put different kinds 658 00:35:32,719 --> 00:35:37,520 Speaker 1: of, um, physical abnormality somewhere on the ascending slope, on 659 00:35:37,600 --> 00:35:41,279 Speaker 1: the uncanny valley slope. So you 660 00:35:41,360 --> 00:35:43,759 Speaker 1: have a normal, healthy human up at the peak, I 661 00:35:43,800 --> 00:35:46,560 Speaker 1: guess somewhere below the artistic ideal of the great Buddha 662 00:35:46,560 --> 00:35:49,759 Speaker 1: statue or something. But you'd have normal, healthy human. Then 663 00:35:49,800 --> 00:35:52,520 Speaker 1: somewhere below them would be people who have who look 664 00:35:52,640 --> 00:35:56,359 Speaker 1: like there is something wrong with them in terms of 665 00:35:56,400 --> 00:35:59,759 Speaker 1: having uh, you know, perfect health and symmetricality. I mean, 666 00:35:59,760 --> 00:36:02,480 Speaker 1: even just an ill person: you encounter someone who 667 00:36:02,719 --> 00:36:05,080 Speaker 1: is clearly a little bit sick or a little bit 668 00:36:05,120 --> 00:36:08,320 Speaker 1: hungover or whatever. You can tell, and it it causes 669 00:36:08,360 --> 00:36:10,640 Speaker 1: a light to go off in your head. Yeah. Uh.
670 00:36:10,680 --> 00:36:14,360 Speaker 1: And yet we can quite easily adapt to people 671 00:36:14,680 --> 00:36:17,080 Speaker 1: like that. You know, you see somebody like that, and 672 00:36:17,200 --> 00:36:20,319 Speaker 1: you just know it is not proper to react to 673 00:36:20,360 --> 00:36:23,440 Speaker 1: somebody with revulsion. Oh yeah, like right now in Atlanta, 674 00:36:23,719 --> 00:36:27,080 Speaker 1: as we're recording this, pollen is everywhere. So there are 675 00:36:27,120 --> 00:36:29,560 Speaker 1: several people in my life... I'm not really affected by 676 00:36:29,600 --> 00:36:33,319 Speaker 1: the pollen so much, but it totally debilitates some of 677 00:36:33,360 --> 00:36:37,759 Speaker 1: my coworkers, some of my friends. Red face, puffy eyes. Yeah. 678 00:36:37,840 --> 00:36:40,960 Speaker 1: And sometimes they're, like, zonked out on allergy medication 679 00:36:41,000 --> 00:36:44,360 Speaker 1: to boot, and you just, you know, 680 00:36:44,400 --> 00:36:47,040 Speaker 1: you accept it. You realize, oh, well, you know, 681 00:36:47,040 --> 00:36:48,880 Speaker 1: my friend here is going to be kind of a 682 00:36:48,880 --> 00:36:51,840 Speaker 1: pollen zombie for a couple of weeks. But that doesn't 683 00:36:51,840 --> 00:36:54,560 Speaker 1: mean we can't hang out. It doesn't mean we can't 684 00:36:54,560 --> 00:36:58,600 Speaker 1: work on this or that. Yeah. So definitely, the Uncanny 685 00:36:58,680 --> 00:37:02,480 Speaker 1: Valley has plenty of critics, and plenty, I think, of 686 00:37:02,600 --> 00:37:05,480 Speaker 1: very fair criticisms leveled at it.
I just want to 687 00:37:05,520 --> 00:37:08,440 Speaker 1: go back to one popular article I came across, a 688 00:37:08,480 --> 00:37:12,160 Speaker 1: two thousand ten article in Popular Mechanics by Erik Sofge, 689 00:37:13,040 --> 00:37:15,280 Speaker 1: where he sort of points out that at the time, 690 00:37:15,640 --> 00:37:18,120 Speaker 1: people were, as I think they are still now, treating 691 00:37:18,160 --> 00:37:21,640 Speaker 1: the Uncanny Valley as a proven fact, but in fact, 692 00:37:21,800 --> 00:37:24,360 Speaker 1: at the time, he says, you know, there's really almost 693 00:37:24,440 --> 00:37:28,000 Speaker 1: no convincing evidence that such a thing even exists. And 694 00:37:28,120 --> 00:37:31,640 Speaker 1: he speaks to an expert named Karl MacDorman, director of 695 00:37:31,680 --> 00:37:35,560 Speaker 1: the Android Science Center at Indiana University, and MacDorman, who 696 00:37:35,719 --> 00:37:38,799 Speaker 1: has conducted research on the valley, offered his opinion in 697 00:37:38,840 --> 00:37:41,560 Speaker 1: the article, saying, quote, it turns out that there may 698 00:37:41,560 --> 00:37:44,760 Speaker 1: be more than one Uncanny Valley. It's not the overall 699 00:37:44,800 --> 00:37:47,920 Speaker 1: degree of human likeness that makes a robot or animated 700 00:37:48,000 --> 00:37:51,520 Speaker 1: character uncanny. It's more a matter of mismatch. If you 701 00:37:51,600 --> 00:37:55,000 Speaker 1: have an extremely realistic skin texture, but at the same 702 00:37:55,040 --> 00:37:59,800 Speaker 1: time cartoonish eyes, or realistic eyes and an unrealistic skin texture, 703 00:38:00,280 --> 00:38:04,160 Speaker 1: that's very uncanny, end quote. So that's an 704 00:38:04,200 --> 00:38:06,840 Speaker 1: idea about the perceptual mismatch that I do want to 705 00:38:06,880 --> 00:38:10,520 Speaker 1: revisit later in this episode.
But the article also speaks 706 00:38:10,520 --> 00:38:14,000 Speaker 1: to a guy named David Hanson, who's a roboticist who 707 00:38:14,040 --> 00:38:18,160 Speaker 1: specifically specializes in creating very realistic humanoid robots. I think 708 00:38:18,160 --> 00:38:22,560 Speaker 1: he did that Einstein head thing. Oh yeah, that thing. Uh. So, 709 00:38:22,640 --> 00:38:26,280 Speaker 1: Hanson claims that even if people find overly realistic robots 710 00:38:26,320 --> 00:38:29,280 Speaker 1: creepy at first, they get used to them within minutes. 711 00:38:29,360 --> 00:38:31,080 Speaker 1: This is sort of what you were just talking about, 712 00:38:31,120 --> 00:38:34,760 Speaker 1: I think. You know, you become acclimated even to something 713 00:38:34,840 --> 00:38:37,720 Speaker 1: that you might, uh, at some kind of base level, 714 00:38:37,840 --> 00:38:41,960 Speaker 1: have a negative reaction to. Yeah. I keep thinking of 715 00:38:41,960 --> 00:38:44,120 Speaker 1: Alien: Isolation in this, because it's the game I'm 716 00:38:44,160 --> 00:38:46,160 Speaker 1: currently playing, and I feel like the 717 00:38:46,200 --> 00:38:48,759 Speaker 1: CGI characters are pretty well done 718 00:38:48,760 --> 00:38:51,680 Speaker 1: in there. I haven't felt that tinge of 719 00:38:52,080 --> 00:38:55,000 Speaker 1: Uncanny Valley washing over me. Some of the voice 720 00:38:55,000 --> 00:38:58,440 Speaker 1: acting's a little weak. But, speaking of the voice, 721 00:38:58,560 --> 00:39:01,640 Speaker 1: like the androids you encounter, though, with the 722 00:39:01,680 --> 00:39:06,319 Speaker 1: Seegson, uh, androids.
Yeah, and when I 723 00:39:06,320 --> 00:39:08,680 Speaker 1: first... Uncanny Valley? Well yeah, but when I first 724 00:39:08,800 --> 00:39:11,400 Speaker 1: encountered them, yeah, they had that uncanny, intentionally kind of 725 00:39:11,440 --> 00:39:15,359 Speaker 1: creepy appearance and the very creepy robot voice. But yet 726 00:39:15,640 --> 00:39:19,040 Speaker 1: when they were not actively attacking me, 727 00:39:19,160 --> 00:39:20,800 Speaker 1: I was kind of cool with it. It wasn't until 728 00:39:20,840 --> 00:39:24,719 Speaker 1: they'd become violent that the mere sound of 729 00:39:24,760 --> 00:39:27,400 Speaker 1: their voice or the appearance of one, uh, down the, 730 00:39:27,560 --> 00:39:30,040 Speaker 1: you know, in the distance down the hallway, would 731 00:39:30,080 --> 00:39:32,680 Speaker 1: cause my nerves to react. I mean, those things are 732 00:39:32,680 --> 00:39:35,680 Speaker 1: funny. They're a good part of that game. 733 00:39:36,640 --> 00:39:40,040 Speaker 1: But anyway, so in this, uh, article, the author also 734 00:39:40,120 --> 00:39:43,560 Speaker 1: cites some other unnamed roboticists, as well as his 735 00:39:43,600 --> 00:39:47,520 Speaker 1: own experience, when he's talking about meeting robots that he 736 00:39:47,560 --> 00:39:50,560 Speaker 1: had previously seen on video. And one thing he says is, 737 00:39:50,680 --> 00:39:53,880 Speaker 1: you know, an Uncanny Valley effect that was present when 738 00:39:53,920 --> 00:39:55,880 Speaker 1: I saw a video of this robot went away when 739 00:39:55,920 --> 00:39:58,560 Speaker 1: I saw it in person. I don't know if that's 740 00:39:58,600 --> 00:40:01,919 Speaker 1: generally true of people; he claims it's true. But even 741 00:40:01,960 --> 00:40:04,200 Speaker 1: if this is truly the case for robots, I'm not 742 00:40:04,200 --> 00:40:07,160 Speaker 1: sure how it would apply to animations.
Probably wouldn't apply 743 00:40:07,239 --> 00:40:11,520 Speaker 1: to animations. Um. But I think that there are some 744 00:40:11,560 --> 00:40:15,799 Speaker 1: good threads to start tugging at here, because it's probably 745 00:40:15,880 --> 00:40:18,960 Speaker 1: the case that there are more dimensions to the Uncanny 746 00:40:19,000 --> 00:40:22,560 Speaker 1: Valley than Mori imagined in nineteen seventy, meaning more 747 00:40:22,600 --> 00:40:27,480 Speaker 1: than just that X axis of, um, closeness to realistic 748 00:40:27,520 --> 00:40:31,799 Speaker 1: human appearance versus distance from realistic human appearance. Yeah, I 749 00:40:31,800 --> 00:40:34,040 Speaker 1: mean, just that: what makes a person human, what makes 750 00:40:34,040 --> 00:40:37,240 Speaker 1: a likeness human. There's arguably a whole chorus of things 751 00:40:37,280 --> 00:40:40,279 Speaker 1: going on there. Yeah, so it would make sense that 752 00:40:40,280 --> 00:40:42,839 Speaker 1: that chorus would play into the Uncanny Valley. Yeah. 753 00:40:42,840 --> 00:40:45,920 Speaker 1: So I do think that there are multiple other dimensions 754 00:40:45,960 --> 00:40:48,719 Speaker 1: to be explored. But I also don't think that means 755 00:40:48,719 --> 00:40:51,720 Speaker 1: we can conclude that there's nothing to the Uncanny Valley. 756 00:40:51,960 --> 00:40:54,440 Speaker 1: And in the past decade there's actually been an explosion 757 00:40:54,480 --> 00:40:56,680 Speaker 1: of research on the Uncanny Valley. So I think we 758 00:40:56,680 --> 00:40:59,280 Speaker 1: should look at a few interesting studies on the effect.
759 00:40:59,560 --> 00:41:02,960 Speaker 1: All right, well, uh, first one here that I came 760 00:41:03,000 --> 00:41:06,759 Speaker 1: across was a two thousand nine Princeton University study, and 761 00:41:06,760 --> 00:41:09,560 Speaker 1: they looked into the effects of the 762 00:41:09,640 --> 00:41:15,040 Speaker 1: Uncanny Valley on macaque monkeys, so, non-human subjects. Yeah, 763 00:41:15,080 --> 00:41:17,120 Speaker 1: because that makes sense, right? If you 764 00:41:17,120 --> 00:41:19,959 Speaker 1: want to see if this is an evolved response, let's 765 00:41:20,000 --> 00:41:23,480 Speaker 1: look beyond the complications of human intelligence and human culture 766 00:41:23,840 --> 00:41:26,919 Speaker 1: and look at something closely related to us. Is it biological 767 00:41:27,080 --> 00:41:30,319 Speaker 1: rather than, say, cultural? Right. And so they showed a 768 00:41:30,360 --> 00:41:34,640 Speaker 1: selection of the primates close-to-real, quote unquote, computer 769 00:41:34,760 --> 00:41:38,320 Speaker 1: visuals of macaques to see if they responded with coos 770 00:41:38,360 --> 00:41:41,800 Speaker 1: and lip smacking, as they do with their fellow monkeys. Uh, 771 00:41:41,840 --> 00:41:46,120 Speaker 1: and these close-to-real computer visuals were essentially 772 00:41:46,640 --> 00:41:50,520 Speaker 1: Lawnmower Man monkeys, if you see... You're kind of asking 773 00:41:50,640 --> 00:41:55,120 Speaker 1: if I've seen Lawnmower Man? You know I've seen Lawnmower Man. Yes, 774 00:41:55,400 --> 00:41:57,800 Speaker 1: so yeah, think Lawnmower Man.
Uh, and you 775 00:41:58,160 --> 00:42:00,759 Speaker 1: kind of have an idea of that level of computer animation, 776 00:42:01,120 --> 00:42:03,480 Speaker 1: and the monkeys did not want any part of it, 777 00:42:03,520 --> 00:42:06,719 Speaker 1: and they averted their eyes, they acted frightened when confronted 778 00:42:06,719 --> 00:42:10,960 Speaker 1: with the Lawnmower Man monkey. So it's not much, I admit, 779 00:42:11,400 --> 00:42:14,759 Speaker 1: but it's a little experimental evidence for the argument that 780 00:42:15,000 --> 00:42:18,279 Speaker 1: the uncanny valley is an evolutionary response. Right, so if you 781 00:42:18,280 --> 00:42:21,640 Speaker 1: can observe it in monkeys, there's probably some element of 782 00:42:21,680 --> 00:42:25,600 Speaker 1: it that is biological, in the brain. It's instinctual 783 00:42:25,760 --> 00:42:28,839 Speaker 1: and not just something we've all learned to say about 784 00:42:28,920 --> 00:42:33,399 Speaker 1: weird-looking animated characters and robots. And that would be 785 00:42:33,480 --> 00:42:36,640 Speaker 1: maybe a weak piece of evidence, but still a piece 786 00:42:36,680 --> 00:42:38,520 Speaker 1: of evidence you could put in the column of saying 787 00:42:38,600 --> 00:42:41,200 Speaker 1: there is something there; the valley does to some extent 788 00:42:41,280 --> 00:42:45,600 Speaker 1: exist. Now, the next study that I ran across, this 789 00:42:45,640 --> 00:42:48,960 Speaker 1: comes back to one of the graphics that you 790 00:42:49,080 --> 00:42:52,680 Speaker 1: pulled out of, I believe, the original, uh, study, correct? Yeah, yeah, 791 00:42:52,760 --> 00:42:56,560 Speaker 1: Mori's original graphs.
So in this graph, 792 00:42:56,680 --> 00:42:58,920 Speaker 1: we talked about diving down into the valley and then 793 00:42:59,320 --> 00:43:01,960 Speaker 1: steadily trying to claw yourself out on the other side, a 794 00:43:02,480 --> 00:43:05,200 Speaker 1: very, very steep ascent. Yeah, so you hit bottom, and 795 00:43:05,239 --> 00:43:08,880 Speaker 1: that's where you have a zombie, and as you begin 796 00:43:08,960 --> 00:43:12,160 Speaker 1: to scale out of the uncanny valley, he has, um, 797 00:43:12,520 --> 00:43:16,400 Speaker 1: myoelectric hand and prosthetic hand down there. As you climb 798 00:43:16,400 --> 00:43:20,759 Speaker 1: back up, eventually hitting ordinary doll, and puppets, and 799 00:43:20,880 --> 00:43:24,680 Speaker 1: ill person, and maybe hitting healthy person at the very top again. 800 00:43:24,760 --> 00:43:27,200 Speaker 1: But it's interesting you have prosthetic hand down there, because 801 00:43:27,200 --> 00:43:31,640 Speaker 1: this next study looks at prosthetic and robotic and human hands. Yeah, 802 00:43:31,680 --> 00:43:33,799 Speaker 1: in the original study, Mori talks about 803 00:43:33,920 --> 00:43:38,080 Speaker 1: the variable creepiness of prosthetic hands. And I 804 00:43:38,120 --> 00:43:39,799 Speaker 1: found this interesting, because I don't know about you, 805 00:43:39,800 --> 00:43:43,960 Speaker 1: but growing up I felt like crazy robot hands especially 806 00:43:44,000 --> 00:43:48,000 Speaker 1: were everywhere, like every G.I. Joe show or He-Man 807 00:43:48,160 --> 00:43:52,439 Speaker 1: type franchise, there's always somebody. It could be a villain, 808 00:43:52,480 --> 00:43:55,200 Speaker 1: it could be a hero.
But there were crazy robot 809 00:43:55,239 --> 00:43:58,400 Speaker 1: hands galore, uh, and I always found them cool, and 810 00:43:58,920 --> 00:44:01,520 Speaker 1: I feel like a lot of us probably even fetishized 811 00:44:01,520 --> 00:44:03,799 Speaker 1: them to a certain point. Like, we didn't understand 812 00:44:04,040 --> 00:44:06,319 Speaker 1: what it would necessarily be like to lose 813 00:44:06,360 --> 00:44:10,880 Speaker 1: a hand, and the shortfall in the ability of technology 814 00:44:10,920 --> 00:44:14,720 Speaker 1: at the time, and even today, to replace that missing limb. 815 00:44:15,120 --> 00:44:18,000 Speaker 1: But we thought, well, that looks cool. Superpowered robot hands? 816 00:44:18,040 --> 00:44:21,040 Speaker 1: Sign me up. Right. But back to the study, a two 817 00:44:21,040 --> 00:44:23,960 Speaker 1: thousand thirteen University of Manchester study, and they looked at 818 00:44:24,000 --> 00:44:29,120 Speaker 1: prosthetic hands. Uh, they used forty three right handed participants, 819 00:44:29,200 --> 00:44:31,920 Speaker 1: thirty six female and seven male, and they were all 820 00:44:31,960 --> 00:44:35,120 Speaker 1: looking at photos, and the photos were divided into three categories: 821 00:44:35,320 --> 00:44:38,719 Speaker 1: human hands; robotic hands, like no question about it, that's 822 00:44:38,719 --> 00:44:41,600 Speaker 1: a robot hand I'm looking at, like straight-up Terminator 823 00:44:41,600 --> 00:44:46,160 Speaker 1: exoskeleton, or even less human; and then prosthetic hands. 824 00:44:46,880 --> 00:44:50,080 Speaker 1: The results, I have to say, reading through some 825 00:44:50,160 --> 00:44:52,640 Speaker 1: of the writing about this, uh, and the original press release, 826 00:44:52,760 --> 00:44:57,080 Speaker 1: the results were kind of confusing sounding.
The subjects 827 00:44:57,080 --> 00:45:01,160 Speaker 1: here preferred human hands and robot hands, but rated 828 00:45:01,400 --> 00:45:05,120 Speaker 1: prosthetic hands as more uncanny, though prosthetics 829 00:45:05,160 --> 00:45:10,520 Speaker 1: that looked more human were less eerie. Okay. So 830 00:45:10,640 --> 00:45:13,600 Speaker 1: something that's clearly a robot, that's not too creepy. Something 831 00:45:13,600 --> 00:45:16,520 Speaker 1: that's clearly a human, that's not too creepy. If something is 832 00:45:16,520 --> 00:45:19,760 Speaker 1: a robot trying to be human, that might be more creepy, 833 00:45:19,920 --> 00:45:23,839 Speaker 1: but as it gets better at being human, it's less creepy. 834 00:45:24,040 --> 00:45:25,719 Speaker 1: I think so; I think that's my take. I mean, 835 00:45:25,800 --> 00:45:29,200 Speaker 1: it also makes me wonder if the hand alone 836 00:45:29,920 --> 00:45:33,200 Speaker 1: is like a subset of the uncanny valley, 837 00:45:33,200 --> 00:45:36,080 Speaker 1: because certainly, if you're just working with a 838 00:45:36,160 --> 00:45:38,920 Speaker 1: hand and trying to replicate the movements, the look, the 839 00:45:39,000 --> 00:45:42,080 Speaker 1: feel of a human limb for an observer... we're 840 00:45:42,080 --> 00:45:44,000 Speaker 1: not going to even get into the, you know, 841 00:45:44,040 --> 00:45:47,600 Speaker 1: the problems of creating something that the user can experience 842 00:45:47,880 --> 00:45:50,920 Speaker 1: as a lifelike limb. But if you're just looking at it, 843 00:45:50,920 --> 00:45:52,760 Speaker 1: if you don't have to worry about its eye contact, 844 00:45:52,800 --> 00:45:55,480 Speaker 1: you don't have to worry about microexpressions, it 845 00:45:55,840 --> 00:46:00,399 Speaker 1: seems like it would be an easier peak to surmount.
Yeah, 846 00:46:00,400 --> 00:46:02,960 Speaker 1: so if that is in fact the correct interpretation, 847 00:46:02,960 --> 00:46:06,320 Speaker 1: that would seem to undercut the steepness in Mori's original 848 00:46:06,400 --> 00:46:09,480 Speaker 1: graph, right, on the final peak. Yeah, that's 849 00:46:09,480 --> 00:46:13,080 Speaker 1: what I'm wondering. Because the hand, 850 00:46:13,080 --> 00:46:17,719 Speaker 1: taken in isolation, I'm thinking, would be easier to replicate. Yeah. 851 00:46:18,719 --> 00:46:21,400 Speaker 1: Uh, and the Uncanny Valley, let's face it, when we talk 852 00:46:21,480 --> 00:46:23,960 Speaker 1: about it, most of the time we're talking about faces. 853 00:46:24,080 --> 00:46:29,480 Speaker 1: Right. Now, speaking of faces, there's another study. Um. This 854 00:46:29,840 --> 00:46:32,319 Speaker 1: is a two thousand and eleven University of California, San 855 00:46:32,360 --> 00:46:36,160 Speaker 1: Diego study. Um. This was published in Social Cognitive 856 00:46:36,160 --> 00:46:39,200 Speaker 1: and Affective Neuroscience, and they did exactly what you'd expect 857 00:46:39,280 --> 00:46:42,600 Speaker 1: researchers to do when confronted with the Uncanny Valley: grab 858 00:46:42,640 --> 00:46:45,000 Speaker 1: the fMRI and see what our brains 859 00:46:45,000 --> 00:46:46,960 Speaker 1: are doing when we're looking at all these images. Ah, 860 00:46:47,040 --> 00:46:50,480 Speaker 1: all these fMRI studies. All right, well, what 861 00:46:50,560 --> 00:46:52,960 Speaker 1: did they find? All right, I'll roll through the basics 862 00:46:52,960 --> 00:46:55,360 Speaker 1: of the study here. So twenty subjects, not 863 00:46:55,400 --> 00:46:59,000 Speaker 1: a huge study here, aged thirty six. And here were 864 00:46:59,040 --> 00:47:02,759 Speaker 1: some of the caveats they had in selecting these individuals.
865 00:47:02,960 --> 00:47:07,120 Speaker 1: No experience working with robots, no time spent in Japan, 866 00:47:07,520 --> 00:47:10,279 Speaker 1: no friends or family from Japan, because they wanted to 867 00:47:10,320 --> 00:47:14,959 Speaker 1: avoid, uh, any, you know, potential cultural exposure that 868 00:47:14,960 --> 00:47:18,480 Speaker 1: would make them more accepting of androids. Okay. 869 00:47:18,480 --> 00:47:21,000 Speaker 1: So the idea is that maybe in Japan people just 870 00:47:21,520 --> 00:47:25,680 Speaker 1: experience humanoid robots way too much already; they're 871 00:47:26,280 --> 00:47:29,600 Speaker 1: acclimatized to them. Yeah, that's the argument they made 872 00:47:29,600 --> 00:47:32,600 Speaker 1: in laying out the study: let's not even go there, 873 00:47:32,760 --> 00:47:35,360 Speaker 1: let's just deal with people who have less exposure to robots. 874 00:47:36,239 --> 00:47:39,200 Speaker 1: And they were shown twelve videos of a humanoid robot 875 00:47:39,480 --> 00:47:42,480 Speaker 1: named Repliee Q2. Oh man, I'm looking it up 876 00:47:42,560 --> 00:47:45,839 Speaker 1: right now. It's rough. Well, they watched 877 00:47:45,920 --> 00:47:49,200 Speaker 1: twelve videos of this robot doing various things, and 878 00:47:49,239 --> 00:47:52,120 Speaker 1: they were shown videos of humans doing the same things. 879 00:47:52,320 --> 00:47:55,920 Speaker 1: And in fact, the robot's movements and mannerisms were patterned 880 00:47:55,960 --> 00:47:59,359 Speaker 1: directly after the humans. So you had 881 00:47:59,480 --> 00:48:02,480 Speaker 1: a human version of the actions, you had an android 882 00:48:02,600 --> 00:48:05,480 Speaker 1: version of the actions, uh, you know, a lifelike robot, 883 00:48:05,640 --> 00:48:08,080 Speaker 1: and then you had a stripped-down version of 884 00:48:08,120 --> 00:48:10,439 Speaker 1: the android.
So basically the android with all its skin 885 00:48:10,520 --> 00:48:13,280 Speaker 1: ripped off, so it looks more like a robot, clearly 886 00:48:13,320 --> 00:48:16,320 Speaker 1: a robot, and it's doing the same motions as well. 887 00:48:16,880 --> 00:48:19,560 Speaker 1: So this broke it all down to a human with 888 00:48:19,560 --> 00:48:22,840 Speaker 1: biological appearance and movement, a robot with mechanical appearance and 889 00:48:22,880 --> 00:48:26,720 Speaker 1: mechanical motion, and a human-seeming agent with the exact 890 00:48:26,800 --> 00:48:32,200 Speaker 1: same mechanical movements as the robot. Then in came the 891 00:48:32,280 --> 00:48:36,040 Speaker 1: fMRI scans. So the main brain area 892 00:48:36,080 --> 00:48:38,520 Speaker 1: of note here, the area that 893 00:48:38,680 --> 00:48:42,480 Speaker 1: lit up, where we saw the most activity: the parietal 894 00:48:42,520 --> 00:48:46,799 Speaker 1: cortex, on both sides of the brain, specifically in the 895 00:48:46,840 --> 00:48:49,960 Speaker 1: areas that connect the part of the brain's visual cortex 896 00:48:50,000 --> 00:48:53,360 Speaker 1: that processes bodily movements with the section of the motor 897 00:48:53,440 --> 00:48:57,719 Speaker 1: cortex thought to contain mirror neurons. So those would be 898 00:48:57,719 --> 00:49:01,360 Speaker 1: like the empathy parts of the brain, where, you know, 899 00:49:01,440 --> 00:49:04,440 Speaker 1: we see something going on in some other creature 900 00:49:04,520 --> 00:49:07,359 Speaker 1: like us, and we empathize with it. Exactly. Yeah. So 901 00:49:07,480 --> 00:49:10,759 Speaker 1: when viewing the human-looking android, the brain lit up 902 00:49:10,800 --> 00:49:15,000 Speaker 1: at the recognition of a human form, but registered essentially 903 00:49:15,280 --> 00:49:18,359 Speaker 1: a computing error over the movement. Something didn't match up.
904 00:49:18,960 --> 00:49:21,320 Speaker 1: Uh, so according to this study, it 905 00:49:21,360 --> 00:49:23,480 Speaker 1: would seem that it's not the biological movement or the 906 00:49:23,480 --> 00:49:28,080 Speaker 1: biological appearance, it's the congruence, or lack of congruence, between 907 00:49:28,120 --> 00:49:30,560 Speaker 1: the two. You look alive but you're dead, you look 908 00:49:30,640 --> 00:49:32,880 Speaker 1: dead but you move or you speak as if 909 00:49:32,880 --> 00:49:36,239 Speaker 1: you're alive. Uh. So the researchers noted that this is 910 00:49:36,280 --> 00:49:41,200 Speaker 1: something that could be retuned through exposure, but it could 911 00:49:41,239 --> 00:49:43,920 Speaker 1: be at the heart of what's going on with the 912 00:49:44,000 --> 00:49:46,960 Speaker 1: Uncanny Valley. Interesting. Well, I think we should look at 913 00:49:47,000 --> 00:49:50,799 Speaker 1: one more study, uh, potentially providing recent support for the 914 00:49:50,800 --> 00:49:53,520 Speaker 1: existence of the Uncanny Valley, and then maybe after that 915 00:49:53,560 --> 00:49:55,640 Speaker 1: we should break and then come back next time to 916 00:49:55,880 --> 00:49:58,600 Speaker 1: get into the causes, what would be causing this 917 00:49:58,680 --> 00:50:02,239 Speaker 1: effect, and, uh, the future. So I want to look 918 00:50:02,280 --> 00:50:05,160 Speaker 1: at a study that came out in twenty sixteen in 919 00:50:05,200 --> 00:50:09,080 Speaker 1: the journal Cognition, by Mathur and Reichling, called Navigating a 920 00:50:09,120 --> 00:50:13,239 Speaker 1: Social World with Robot Partners: A Quantitative Cartography of the 921 00:50:13,320 --> 00:50:17,880 Speaker 1: Uncanny Valley. Cute invocation of map making there, because it 922 00:50:17,920 --> 00:50:19,640 Speaker 1: does kind of make sense.
I like the idea of 923 00:50:19,680 --> 00:50:23,400 Speaker 1: mapping the valley, because that indicates that it may expand 924 00:50:23,480 --> 00:50:27,040 Speaker 1: beyond just the one-dimensional dip and is in fact 925 00:50:27,080 --> 00:50:30,480 Speaker 1: more of a topographical space, you know, like we can 926 00:50:30,520 --> 00:50:34,120 Speaker 1: extend into three dimensions. But anyway. So to get into 927 00:50:34,120 --> 00:50:36,880 Speaker 1: the study, the authors note that while the Uncanny Valley 928 00:50:36,920 --> 00:50:39,640 Speaker 1: has very strong intuitive support, people tend to take it 929 00:50:39,680 --> 00:50:43,840 Speaker 1: as fact, but experimental evidence for it has been limited and inconsistent. 930 00:50:43,920 --> 00:50:47,120 Speaker 1: As we mentioned earlier, some studies seem to find 931 00:50:47,160 --> 00:50:50,560 Speaker 1: evidence for the valley, others don't, you know; they say 932 00:50:50,560 --> 00:50:55,680 Speaker 1: this isn't necessarily a thing. So there are multiple experiments here. First, 933 00:50:55,760 --> 00:50:57,560 Speaker 1: they did a thing that I think was pretty smart. 934 00:50:57,640 --> 00:51:01,319 Speaker 1: If they were trying to chart a linear progression of 935 00:51:01,400 --> 00:51:04,000 Speaker 1: the up and down peaks and valleys, they tried to 936 00:51:04,080 --> 00:51:09,000 Speaker 1: generate an objectively determined gradient of more and less human-937 00:51:09,080 --> 00:51:11,719 Speaker 1: looking robots. So what a lot of these studies do 938 00:51:11,880 --> 00:51:15,640 Speaker 1: is, maybe along the lines of the macaque study, they show you 939 00:51:15,680 --> 00:51:18,520 Speaker 1: a Lawnmower Man, they show you a real person, they 940 00:51:18,520 --> 00:51:21,279 Speaker 1: show you a robot, uh, and they ask you to 941 00:51:21,440 --> 00:51:23,880 Speaker 1: characterize, you know, how do you feel about these?
What 942 00:51:24,000 --> 00:51:26,960 Speaker 1: they did here is that they gathered a very large 943 00:51:27,000 --> 00:51:30,680 Speaker 1: sample, or relatively large sample, of eighty images, quote, from 944 00:51:30,760 --> 00:51:34,720 Speaker 1: the wild, meaning from the Internet. So these wild-type 945 00:51:34,840 --> 00:51:37,279 Speaker 1: robot samples. And they had a bunch of inclusion and 946 00:51:37,360 --> 00:51:39,799 Speaker 1: exclusion criteria. I don't want to get into all of them, 947 00:51:39,800 --> 00:51:42,680 Speaker 1: but they tried to limit it to where it would 948 00:51:42,760 --> 00:51:45,600 Speaker 1: kind of throw out all these variables that 949 00:51:45,600 --> 00:51:48,080 Speaker 1: could complicate things. Like, they tried to keep just certain 950 00:51:48,160 --> 00:51:52,120 Speaker 1: types of pictures of faces of real robots that are 951 00:51:52,160 --> 00:51:56,440 Speaker 1: built, and they had some exclusion criteria, like 952 00:51:56,520 --> 00:51:58,920 Speaker 1: it couldn't be a well-known character or a famous person. 953 00:51:59,760 --> 00:52:02,759 Speaker 1: Um, it couldn't have objects overlapping the face, it 954 00:52:02,800 --> 00:52:05,600 Speaker 1: couldn't be a toy, it had to be a real 955 00:52:05,719 --> 00:52:08,960 Speaker 1: humanoid robot. And then they had subjects rate these images 956 00:52:09,480 --> 00:52:12,920 Speaker 1: on what they call the mechano-humanoid scale, basically to 957 00:52:13,000 --> 00:52:16,880 Speaker 1: come up with an objectively derived score for each image 958 00:52:16,920 --> 00:52:19,719 Speaker 1: by using this empirical research, by going to a 959 00:52:19,719 --> 00:52:22,200 Speaker 1: bunch of people and saying, hey, how mechanical is this? 960 00:52:22,520 --> 00:52:26,239 Speaker 1: How human is this?
And then after they had a 961 00:52:26,360 --> 00:52:28,680 Speaker 1: rating for each of these eighty images... and Robert, I have 962 00:52:28,800 --> 00:52:32,239 Speaker 1: included an image, uh, I think, down here to show you, 963 00:52:32,280 --> 00:52:35,000 Speaker 1: like, what all these robots look like, where you can kind of see. 964 00:52:35,040 --> 00:52:38,239 Speaker 1: It starts with things that look not human at all, 965 00:52:38,440 --> 00:52:41,520 Speaker 1: just like a lump of wires and junk, and then 966 00:52:41,560 --> 00:52:44,160 Speaker 1: it proceeds up to something that looks like a picture 967 00:52:44,160 --> 00:52:47,439 Speaker 1: of a guy. Yes, yeah, very much. Though you start 968 00:52:47,480 --> 00:52:52,680 Speaker 1: off with very kind of WALL-E-esque heads, then you move 969 00:52:52,719 --> 00:52:57,480 Speaker 1: in through, like, skinless gremlins, and then through the 970 00:52:57,520 --> 00:53:02,240 Speaker 1: sort of expected hierarchy of humanoid robots. Okay, 971 00:53:02,280 --> 00:53:05,080 Speaker 1: so they've got this thing, and then they rate all 972 00:53:05,120 --> 00:53:09,080 Speaker 1: these images and sort them into an ascending scale of humanness. 973 00:53:09,800 --> 00:53:12,279 Speaker 1: And then they took ratings, in multiple different ways, of 974 00:53:12,360 --> 00:53:16,640 Speaker 1: likability and trustworthiness. Now, in likability, they claimed to find 975 00:53:16,680 --> 00:53:21,520 Speaker 1: a robust Uncanny Valley effect, where likability increased linearly with 976 00:53:21,760 --> 00:53:24,799 Speaker 1: humanoid qualities up to a certain point, and then it 977 00:53:24,840 --> 00:53:28,440 Speaker 1: took a negative dip as the humanoid qualities continued to 978 00:53:28,560 --> 00:53:31,600 Speaker 1: increase past that point, and then once again began to 979 00:53:31,760 --> 00:53:34,560 Speaker 1: rise at the far end of the scale.
Now, one 980 00:53:34,560 --> 00:53:36,320 Speaker 1: thing I want to say, just looking at the results, 981 00:53:36,400 --> 00:53:39,240 Speaker 1: is it does not appear that people were the most 982 00:53:39,440 --> 00:53:42,400 Speaker 1: bothered by the things that were the most human looking. 983 00:53:43,000 --> 00:53:45,799 Speaker 1: Like, given my understanding of the uncanny valley, I would 984 00:53:45,800 --> 00:53:48,120 Speaker 1: have expected the stuff at the very top end of 985 00:53:48,120 --> 00:53:51,440 Speaker 1: the scale to be the most disturbing. But they actually 986 00:53:51,560 --> 00:53:53,799 Speaker 1: kind of liked the stuff at the very top end 987 00:53:53,800 --> 00:53:56,760 Speaker 1: of the scale. It was somewhere closer to the upper-988 00:53:56,920 --> 00:53:59,840 Speaker 1: half middle of the scale that they really didn't like. 989 00:54:00,800 --> 00:54:03,680 Speaker 1: Um. So, to whatever extent there is a real Uncanny Valley, 990 00:54:03,719 --> 00:54:06,720 Speaker 1: it might not lie so close to the, quote, realism 991 00:54:06,760 --> 00:54:10,000 Speaker 1: end of the spectrum as we think. They also performed some 992 00:54:10,120 --> 00:54:13,320 Speaker 1: trust experiments, by creating a scenario where subjects would be 993 00:54:13,360 --> 00:54:16,319 Speaker 1: asked to trust these robots to invest money for them, 994 00:54:17,080 --> 00:54:19,920 Speaker 1: and the results there were, basically, they claimed that the 995 00:54:20,000 --> 00:54:24,600 Speaker 1: trust, uh, experiments did show some Uncanny Valley effects, but 996 00:54:24,640 --> 00:54:27,080 Speaker 1: the results were a little more complicated. On the 997 00:54:27,120 --> 00:54:32,200 Speaker 1: straightforward, superficial likability scale, the likability really did look like 998 00:54:32,440 --> 00:54:37,120 Speaker 1: the Uncanny Valley was being displayed.
They also performed experiments with 999 00:54:37,520 --> 00:54:41,600 Speaker 1: a more traditional quote controlled series of composed face images, 1000 00:54:41,680 --> 00:54:44,720 Speaker 1: so it would just be a series of basically the same 1001 00:54:44,760 --> 00:54:47,799 Speaker 1: face as a robot, then a little bit more human, 1002 00:54:47,840 --> 00:54:50,239 Speaker 1: a little bit more human, a little bit more human, on 1003 00:54:50,280 --> 00:54:53,680 Speaker 1: this gradient of humanness. And they generally claimed to 1004 00:54:53,719 --> 00:54:56,360 Speaker 1: find that there was evidence for the uncanny valley effect 1005 00:54:56,360 --> 00:55:00,520 Speaker 1: in both likability and trust, with both the wild-caught 1006 00:55:00,680 --> 00:55:04,799 Speaker 1: robot image samples and with these composed face images that 1007 00:55:04,840 --> 00:55:08,080 Speaker 1: they came up with. But as always, more studies are needed. 1008 00:55:08,080 --> 00:55:12,240 Speaker 1: But it looks like there is one study showing pretty 1009 00:55:12,280 --> 00:55:16,839 Speaker 1: solid evidence that there is something like an uncanny valley effect. Yeah, 1010 00:55:16,840 --> 00:55:19,399 Speaker 1: and I like the idea that it's 1011 00:55:19,600 --> 00:55:22,319 Speaker 1: an uncanny valley, but maybe it's just a little more 1012 00:55:22,520 --> 00:55:27,200 Speaker 1: nuanced from a topographical standpoint. You know, 1013 00:55:27,200 --> 00:55:30,399 Speaker 1: there are more little bumps and little valleys within 1014 00:55:30,920 --> 00:55:34,239 Speaker 1: the overall valley, little caves you can crawl into and 1015 00:55:34,640 --> 00:55:37,839 Speaker 1: just lose yourself inside, and maybe even caves that turn into 1016 00:55:37,880 --> 00:55:40,480 Speaker 1: tunnels that emerge on the other side. Yeah, that's 1017 00:55:40,520 --> 00:55:42,400 Speaker 1: an interesting thing.
I mean, like, they point out that 1018 00:55:42,400 --> 00:55:44,799 Speaker 1: there's a lot of variability in their data, actually. Like, 1019 00:55:44,840 --> 00:55:48,440 Speaker 1: it wasn't... um, if you look at their plot 1020 00:55:48,520 --> 00:55:51,239 Speaker 1: of where all the data points fall, and then 1021 00:55:51,280 --> 00:55:53,280 Speaker 1: they plot a line going through it. If you plot 1022 00:55:53,320 --> 00:55:55,399 Speaker 1: a line going through all their data, it does show 1023 00:55:55,400 --> 00:55:58,319 Speaker 1: the uncanny valley effect. But, you know, there are 1024 00:55:58,360 --> 00:56:01,960 Speaker 1: outliers all over the place, like, there 1025 00:56:02,239 --> 00:56:05,839 Speaker 1: are some robots that are just consistently more liked 1026 00:56:05,880 --> 00:56:08,279 Speaker 1: than the other ones. I find it interesting that 1027 00:56:08,560 --> 00:56:10,759 Speaker 1: some of the higher-rated ones, or at least, 1028 00:56:10,840 --> 00:56:14,319 Speaker 1: I think, number seventy-nine in particular, kind of 1029 00:56:14,360 --> 00:56:19,160 Speaker 1: looks like a generic human, as opposed to, say, go 1030 00:56:19,239 --> 00:56:22,120 Speaker 1: down to seventy-four, that looks like a very specific 1031 00:56:22,200 --> 00:56:24,799 Speaker 1: human. Like, if I had to pick him, or pick 1032 00:56:24,840 --> 00:56:29,920 Speaker 1: the human he's patterned after, presumably, out of a police lineup, 1033 00:56:30,000 --> 00:56:31,600 Speaker 1: I feel like I'd be able to do it. But 1034 00:56:31,680 --> 00:56:34,480 Speaker 1: also, seventy-four looks angry. I'm sorry, folks, you can't 1035 00:56:34,480 --> 00:56:37,319 Speaker 1: see what we're talking about, but it's frowning at you, 1036 00:56:37,719 --> 00:56:40,680 Speaker 1: kind of like, should I kill all humans or just 1037 00:56:41,120 --> 00:56:43,960 Speaker 1: shrug it off?
And maybe today's the day. That 1038 00:56:44,000 --> 00:56:47,239 Speaker 1: does introduce... there are a lot of complicating factors here, 1039 00:56:47,239 --> 00:56:50,040 Speaker 1: and the authors acknowledged this: like, these images don't all 1040 00:56:50,080 --> 00:56:53,279 Speaker 1: have necessarily the same emotional affect. Like, some of them 1041 00:56:53,320 --> 00:56:57,160 Speaker 1: seem happy, some seem unhappy. There's enough variability across the 1042 00:56:57,200 --> 00:57:01,120 Speaker 1: board that you can think you're getting a reasonably decent 1043 00:57:01,280 --> 00:57:05,800 Speaker 1: answer when you plot reactions across all samples. But yeah, 1044 00:57:05,840 --> 00:57:08,799 Speaker 1: there's definitely a lot of different stuff going on here 1045 00:57:09,120 --> 00:57:12,880 Speaker 1: in addition to just being more or less human. I 1046 00:57:12,960 --> 00:57:15,759 Speaker 1: like how thirty-four on our chart here 1047 00:57:16,040 --> 00:57:20,000 Speaker 1: seems to rely heavily on animated mustache and eyebrows. 1048 00:57:20,120 --> 00:57:22,200 Speaker 1: Oh yeah, what is that? It looks like a... it 1049 00:57:22,240 --> 00:57:25,320 Speaker 1: looks like a very... mustache. I can't add to what 1050 00:57:25,360 --> 00:57:28,280 Speaker 1: you've just said. It's got a white mustache and brow 1051 00:57:28,400 --> 00:57:31,520 Speaker 1: and beard, and it's saying "bye." It looks like a 1052 00:57:31,560 --> 00:57:34,320 Speaker 1: lot of these incomplete puppets, or stripped-away puppets, you 1053 00:57:34,320 --> 00:57:36,120 Speaker 1: see, where they're like, all right, we got a lot 1054 00:57:36,120 --> 00:57:37,680 Speaker 1: of work to do on this thing, but at least 1055 00:57:37,680 --> 00:57:39,880 Speaker 1: we got the eyebrows and a mustache in place. But see, 1056 00:57:39,920 --> 00:57:42,200 Speaker 1: I find that one very likable.
It doesn't look very 1057 00:57:42,240 --> 00:57:44,840 Speaker 1: human at all, but it's very... I want to play 1058 00:57:44,880 --> 00:57:47,080 Speaker 1: with it. Yeah, okay, Robert. Well, we've got a bunch 1059 00:57:47,080 --> 00:57:48,400 Speaker 1: more stuff to talk about, but I think we 1060 00:57:48,440 --> 00:57:50,760 Speaker 1: should call it there and come back and finish our 1061 00:57:50,760 --> 00:57:53,760 Speaker 1: discussion of the uncanny valley next time. Yeah, we'll get 1062 00:57:53,760 --> 00:57:56,400 Speaker 1: into... we'll go beyond the uncanny valley. Yeah, so we'll 1063 00:57:56,480 --> 00:57:59,640 Speaker 1: talk about what might cause the uncanny valley effect, 1064 00:57:59,640 --> 00:58:02,080 Speaker 1: to whatever extent it does exist, and we 1065 00:58:02,120 --> 00:58:05,480 Speaker 1: can talk about, you know, what happens when you ascend 1066 00:58:05,600 --> 00:58:09,200 Speaker 1: that far slope. All right? Well, hey, in the meantime, 1067 00:58:09,400 --> 00:58:11,480 Speaker 1: head on over to Stuff to Blow Your Mind dot com. 1068 00:58:11,560 --> 00:58:14,000 Speaker 1: That is where you will find all the podcast episodes. 1069 00:58:14,040 --> 00:58:17,360 Speaker 1: You'll find videos, blog posts, as well as links out 1070 00:58:17,400 --> 00:58:19,880 Speaker 1: to our various social media accounts, and the landing page 1071 00:58:19,880 --> 00:58:22,600 Speaker 1: for this episode should include some links to some of 1072 00:58:22,600 --> 00:58:25,440 Speaker 1: the resources we're talking about here today.
And if you 1073 00:58:25,440 --> 00:58:27,520 Speaker 1: want to get in touch with us, as always, with 1074 00:58:27,600 --> 00:58:30,120 Speaker 1: feedback on this episode or any other, or you just 1075 00:58:30,120 --> 00:58:31,880 Speaker 1: want to say hi, or you want to let us 1076 00:58:31,920 --> 00:58:34,040 Speaker 1: know an episode topic you'd like us to cover in 1077 00:58:34,040 --> 00:58:36,720 Speaker 1: the future, you can email us at blow the mind 1078 00:58:36,760 --> 00:58:48,360 Speaker 1: at how stuff works dot com. For more on this 1079 00:58:48,560 --> 00:58:51,080 Speaker 1: and thousands of other topics, visit how stuff works 1080 00:58:51,080 --> 00:59:12,040 Speaker 1: dot com.