Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It's time for a TechStuff Tidbits episode. I hope you're ready for another spooky edition of TechStuff. For those of you listening from the future, I've been doing a few tangentially spooky episodes for the month of October 2022, and today I thought we would talk about the concept of the uncanny valley. We're going to look at the hypothesis of the uncanny valley, as well as some criticisms and counterarguments related to the idea. Not that the uncanny isn't something we experience, at least that feeling of uneasiness we get when we encounter something that's said to be in the uncanny valley, but rather the actual concept of the valley itself, whether or not that's a valid idea. Before we can do any of that, we need to talk about what the uncanny valley is, or what it's supposed to be, at least in general terms.

The phrase comes from roboticist Masahiro Mori. He has made numerous contributions in the fields of robotics and human-robot interaction, including human emotional responses to robots, which is really what you find when you get to the heart of the uncanny valley. Way back in 1970, Mori penned an article for a Japanese journal called Energy, and it was in this article that we got the phrase "uncanny valley." Mori's observation was this: robots tend to get more appealing as they start to look more humanlike, but only up to a certain point. Once you go past that point, they become incredibly unappealing. It can get to the point where it's not just that you have no emotional response, but that you have a negative emotional response upon encountering the robot. It becomes creepy and disturbing. We have entire horror movies that are based on this.
A very current example is the film M3GAN, spelled M, three, G, A, N. You may have seen the trailer, or maybe an animated GIF from it, where you see a child-sized robot, like a robot little girl but with artificial eyes, doing this dance in a hallway, and a lot of people have talked about how it's kind of nightmare-inducing. That plays on this concept of the uncanny valley: when you reach a level that looks pretty human but at the same time is distinctly not human, there's something about it that is very much not human, and that's when it really turns you off. Then, as you get closer to human again, once you get past this point, the emotional response improves. So in other words, you've got this continuum from not human at all, which is usually matched with little to no emotional response, to gradually becoming more and more humanlike with an improved emotional response, to getting almost but not quite humanlike, where you have a plunge in the emotional response, and then you get past this point and you reach the improvement again. That dip there, that's the uncanny valley. It's when robots appear to be strange and unsettling, or uncanny, as it were.

This concept, of course, extends beyond robotics. We also use it for stuff like computer-generated characters. Remember way back in 2004, when the film The Polar Express came out? If you're not familiar with The Polar Express, it's a computer-animated film. Tom Hanks did voices for it, and there's a character that looks like a creepy version of Tom Hanks in it, and the animators used live-action references, including motion-capture performances, to serve as the underlying animation for the characters in the film.
Several reviewers praised the story, but a lot of them, in both positive and negative reviews, commented that the human characters in the film seemed off and a little bit creepy as a result, that it was kind of unsettling, especially to see their eyes move. Now, right away there's a bit of a problem with this general approach to the uncanny valley idea, and that's the fact that robots' or computer-generated characters' appearances aren't really designed to sit on a gradient like this. That was not necessarily the prime consideration. If all artificial constructs, all computer-animated characters and all robots, were ultimately meant to have an appearance indistinguishable from a human, or at least to look human, with the end goal being that you couldn't tell the difference between artificial and real, then we would have a pretty good basis for our thesis about the uncanny valley, right? Because you would have this purposeful gradient there. But instead, there are robots that are never meant to look human in appearance. That was never a consideration, because it wasn't necessary for the job the robot was going to do. So it's a non-factor, right? The robots were designed to do something, to solve some sort of problem or carry out some sort of task, and unless that happens to include the requirement to look like a human at the same time, there's no reason to make that part of the design process. Heck, we've seen this with various robot designs recently, where a lot of companies were looking at using robots with wheels instead of robots with legs, because legs are way more complicated to work out than wheels. Making sure that the robot can balance and can move around without falling over, these are non-trivial challenges. So if there's no reason for the robot to have those features, why build them into the robot?
Right? If the robot doesn't need to move around at all, then it doesn't have legs or wheels or anything; it's stationary. So a robot that's meant to quickly weld a dozen spots on a car chassis can look like a big old industrial arm that has a few points of articulation. Or you might have a robot that's just this big stationary block that doesn't look like a robot to you at all, right? It doesn't have any features on it that scream "robot" to you, but it is in fact an automated, repetitive machine designed to do some task on its own, more or less. You might call it a robot by those definitions, but it doesn't look like what we think of when we hear the word robot. It may just be another element along an assembly line that helps produce some manufactured component. So these robots don't look human, but they also don't creep us out, right? We don't look at these giant industrial robots and feel like they have an unsettling influence on us. We have a non-emotional response to most of them. Each is just a thing, just a tool. It's like seeing a hammer or a screwdriver; you're not likely to have some sort of emotional response to that. So maybe instead of grouping all robots together and saying they fall along this uncanny valley, we should really focus on a subset of robots, ones that incorporate elements of humanlike appearance. That would count out stuff like Roombas. You wouldn't put your Roomba in that category, right? Your industrial robots also wouldn't go in there. And I'm not saying, by the way, that your robotic vacuum cleaner isn't cute. I'm just saying that for the purposes of an uncanny valley conversation, it doesn't really fit. But let's get back to Masahiro Mori. He created a pretty simple graph, a line chart with an x axis and a y axis.
On the x axis, the horizontal axis, he put human likeness, so the further to the right you go along this axis, the more like a human the robot would appear. The y axis, the vertical axis, is affinity, or the emotional response, running from no response up to a positive one; and if you dip below the line, below the x axis, you go into negative response territory. So this is how he would plot robots. He would argue that industrial robots score low both in human likeness and in affinity: they don't look like humans, and we don't tend to have an emotional response to them. If you were to plot them on this graph, the points would sit pretty close to the x axis and pretty close to the y axis, the low part of the line. But then you get into things like toy robots, right? These toys may look slightly more like a human than industrial robots do. They might be anthropomorphic, they might be bipedal. But you would never mistake a simple toy robot for a human. You would just recognize the humanlike features, and our affinity goes up; our emotional response starts to be more positive compared to, say, industrial robots, presumably because they look more like people. So we have a slight upward slope in the line that we begin to plot. All right, before we get too far into this slope and this line, let's take a quick break. We'll come right back and talk a little more about the uncanny valley.

Speaker 1: Okay. Where we left off, we were saying that toy robots are slightly more humanlike in appearance than, say, industrial robots, and that we have a more positive emotional response to them. So, according to Mori's graph, once we get past the fifty percent point, where we start to really approach human likeness, we still see improvement in affinity. As robots get to look more and more humanlike, we start to be more and more positive in our acceptance of them.
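To help picture the chart being described, here is a minimal Python sketch of the kind of curve Mori drew. The numbers are invented placeholders, not data from Mori's article; only the overall shape, a rise in affinity, a plunge below zero near "almost human," and a recovery at the far end, follows the description in this episode.

```python
import numpy as np
import matplotlib.pyplot as plt

def affinity(likeness):
    """Hypothetical affinity curve: a steady rise in affinity with a
    sharp dip carved out near the almost-but-not-quite-human region.
    The coefficients are illustrative guesses, not Mori's data."""
    return likeness - 1.6 * np.exp(-((likeness - 0.85) ** 2) / 0.004)

x = np.linspace(0.0, 1.0, 400)
plt.plot(x, affinity(x))
plt.axhline(0, color="gray", lw=0.5)  # below this line: negative affinity, "creepy"
plt.xlabel("Human likeness")
plt.ylabel("Affinity (emotional response)")
plt.title("Uncanny valley (illustrative sketch)")

# Rough landmarks mentioned in the episode; the x positions are guesses.
for pos, label in [(0.1, "industrial\nrobot"), (0.45, "toy robot"),
                   (0.85, "almost\nhuman"), (0.99, "human")]:
    plt.annotate(label, (pos, affinity(pos)), ha="center",
                 textcoords="offset points", xytext=(0, 10), fontsize=8)

plt.tight_layout()
plt.show()
```

Running it shows affinity climbing with human likeness, bottoming out well below zero just before full human likeness, and then recovering, which is the dip the episode keeps pointing back to.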
So think about robots that have very expressive eyes, for example. They don't necessarily look like a human, but think of a robot that has a vaguely human shape, with a torso and a head and so on, and eyes that are large and appear to be expressive. Those are the kinds of things we tend to respond positively to. But then, as we start to make robots look more and more humanlike, maybe we put an artificial skin on them, we might put a wig on them to give them hair, we might make their eyes appear more human (and those eyes may or may not house cameras, who knows), and we may give them the ability to make certain expressions. Then we start to reach a point where we get that negative reaction; the affinity dips below zero. So on our little chart, the slope now goes all the way down below the x axis and bottoms out somewhere around there, where we get to some horrifying monstrosity that just makes us wet ourselves when we look at it. Now, once you get past that point, where the robot or artificial human is beyond that creepy factor, the line starts to go back up again, and eventually it overtakes where we left off before the dip, so that we eventually start seeking out experiences with those robots, or we seek out films that use that kind of computer animation, because we find it really appealing. It's no longer that there's this gap, this perception that the thing is unnatural and it bothers us.

One example I saw plotted on Mori's line chart, on the far side where things are going up again, where we're starting to see improvement, is the bunraku puppet. Bunraku is a form of puppet theater that dates back hundreds of years in Japan, and in traditional bunraku you have three puppeteers who control each figure.
You have a primary puppeteer who is in charge of the character's head movements as well as its right hand; you have a secondary puppeteer who controls the left hand; and you have a third puppeteer responsible for the legs and feet. By the way, historically in bunraku, if you were to train in that kind of puppetry, you would start as someone who manipulates the feet and legs, work your way up to being the left hand, and then ultimately, if you were really skilled, you could graduate to the point where you controlled a puppet's head and right hand. This art form obviously requires very careful coordination between the three puppeteers, and really accomplished performers can create incredibly lifelike movements; it is captivating to watch bunraku theater. But I think the inclusion of bunraku on the positive side, past the uncanny valley, demonstrates that there's an element of cultural and social factors that can affect our emotional reaction when we see something we would describe as falling into the uncanny valley. Because I think, at least for some Western audiences, watching a bunraku performance might be a little unsettling. Certainly, the puppets themselves can sometimes inspire that creepy feeling. Because this kind of puppet theater is not part of my culture (even though I've seen these performances, it's not something from my own culture or something I was exposed to a lot as a child), I find bunraku puppets to be a little unsettling. In fact, I'll never forget when I visited my sister, a puppeteer at the Center for Puppetry Arts here in Atlanta. They have an amazing museum, and I visited after hours, but they let me in and said, go ahead, you can walk through the museum if you like. This was with all the main lights turned off; they had lighting, but it was not the same lighting they would have during the day.
So I'm in this dim area filled with puppets all staring at me, and it was very creepy. There were puppets of varying degrees of human likeness, some of them, like bunraku puppets, fairly humanlike, some of them more abstract. It was a really creepy experience. Now, my point here is that the uncanny valley might best describe a feeling you get upon encountering something that seems both humanlike and alien at the same time. It doesn't necessarily describe a continuum that goes from not human at all to indistinguishable from human. That is possibly the wrong way to look at it, right? It's not necessarily "things are getting better, things are getting better, things are getting better, whoa, things are really terrible, things are getting better again," because, again, it all depends on what the robot is built for, what the computer-generated character is made for, and how it performs. I think it's fair to say that I experienced a similar unsettling feeling when I watched early videos of Boston Dynamics robots, you know, those four-legged robots. They looked kind of like they were moving like an animal and kind of like they weren't, and that was a really unsettling experience. And they look nothing like humans, right? Now, I am by no means an expert when it comes to subjects like robotics or psychology, so I fully admit that my hypothesis comes without the benefit of expertise; I can only speak from what I've researched. It may well be that we can describe artificial beings' appearances as falling on this continuum that includes the uncanny valley, but I think there are other factors and variables at play here, mostly regarding the gap between an artificial being's behavior and appearance and the behavior and appearance of its real analog. And it might well be that younger generations will experience less of a sense of the uncanny valley, because they will spend more of their lives around artificial beings, either in their entertainment or in the real world around them.
So it may be that the uncanny valley will gradually fade away, not because this continuum was solved, not because we suddenly jumped past this perceived gap, but because the experience is no longer as foreign or alien to the people experiencing it as it is to older generations like mine. So yeah, that's kind of my take on the uncanny valley. I really do think that if you have those subtle gaps, especially around eye movement, you can look and say, this looks photorealistic, this looks like a really well-done video, like it looks like an actual person, except there's something off about the eyes: the reflectiveness of them, the wetness of the eyes, the movement of the eyes. That's really where I see a lot of this showing up. There's also a lot to be said about subtle facial expressions that aren't always captured in computer-generated versions of characters, where things just seem a little too flat. But anyway, I thought it would be fun to talk about the uncanny valley and where that idea came from. I'm curious to hear your thoughts on it if you would like to let me know, or maybe you've got a suggestion for a future episode. There are a couple of ways you can get in touch. One is to download the iHeartRadio app; it's free to download and use. You can navigate to TechStuff in the little search bar, and there's a little microphone icon where you can leave a message up to thirty seconds in length. Let me know if you would like me to play it in a future episode. Or you can get in touch on Twitter; the handle for the show is TechStuffHSW. And I'll talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.