Welcome to TechStuff, a production from iHeartRadio.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and I love all things tech. And it is time for a classic episode of TechStuff. This episode originally published way back on October fifteenth, two thousand fourteen. It is titled "Do We Need Humanoid Robots?" An interesting question, because we often, at least, I don't know, I can't say we. I often think of robots as sort of anthropomorphic robots. Like, I think of droids in Star Wars, like C-3PO, that kind of droid. Not R2; he's clearly not humanoid. But you know, I think of humanoid robots more frequently than other types, despite the fact that the vast majority of robots I have encountered have not been humanoid robots. So that does raise a question: does a humanoid robot even make sense? We try and answer that in this particular episode. Enjoy.

I asked Josh what he would like to cover, because, with the fact that I've got all these guests coming in to sit down with me, um, you know, some people like to come up with their own suggestions, some people prefer it if I pick a topic and then they research it. I asked Josh what he would like to talk about, and you were really interested in the idea of humanoid robots.

Well, you have this awesome spreadsheet of, um, of listener suggestions, and it might as well have been in neon when I was going down the sheet. I'm like, humanoid robots, of course.

And this is a great topic. In order to really get into it, I was going to define a few terms, even though a lot of these are ones that I think most of us just kind of understand, just from the fact that this is in our culture now. It's not just a reality as far as technology goes, but it plays a large part in fiction. In fact, that's where the term robot comes from, is from fiction. Uh, it was from Karel Čapek, a Czechoslovakian playwright.
I did, in fact, listen to it a couple of times. Karel Čapek, yeah, because his last name is spelled C-A-P-E-K, and it includes, uh, symbols that are not in the English alphabet, like squiggly lines and little UFOs and things.

He wrote R.U.R., also known as Rossum's Universal Robots, and the word robot comes from the Czech word robota, which means forced labor. Yeah, so a robot is an entity, a synthetic construct, that is forced to do work. Then we have humanoid, which just means resembling a human being. That's a term that is relatively young; it started showing up around the turn of the twentieth century. And, uh, I think the first few times it was ever mentioned were around nineteen twelve, and it was mostly used then to describe fossils, saying these are humanoid fossils.

Like, yeah. And then we have android, which is... we're probably not going to be using that word very often, but an android is a robot that's in the form of a human. So all androids are robots, but not all robots are androids.

And you know, I ran into... I looked up android as well, and apparently that's from, like, the early eighteenth century. It's a little odd that it actually predates robots.

Yeah, but, uh, when we look at myths and legends, there are so many stories that involve a human-like entity that's not actually a person that you can see where it gets translated in there. This gets a little confusing, too, because in Star Wars they called all their robots droids, but they aren't androids. R2-D2 is not an android, because he's not, um, or it's not, human-shaped. You could even argue that C-3PO is not a true android, because some people say that to be an android you have to appear, at least on casual glance, to be human. He's way too shiny.

Way too shiny. Data from Star Trek: The Next Generation might be, uh, an android, but he's an android who's had, you know, a long time in the man cave.
He hasn't seen much sun, right?

Yeah, because he's definitely got a weird complexion thing going. The kid in A.I. would be an android.

David, yes. Which turns out to be a popular name for droids, because there was... well, there's David in A.I., and there was also David in Prometheus.

Oh, you know, I never saw Prometheus.

I don't know if your listeners are going to agree with me or not, because I could see, um, getting shouted down, but I thought Prometheus was a great movie. Even upon second viewing, I thought it was good. You know, I know that on the artistic level a lot of people really loved it, and then there were some people who said, how can you get lost if you have a three-dimensional map with you at all times? But, you know, plot versus art, I don't know.

Then you've got replicants from Blade Runner. These are more like cyborgs, because they have some sort of organic material attached to them. They're not completely, you know, synthetic material.

So Terminator is another example. They have a fleshy outer skin on top of their metallic bodies.

But was that real skin that he had there, or was it a synthetic skin? Because that would make a difference.

That's a good question. And I, uh... you know, I know that they refer to them as cyborgs at least a few times in the movies, which would suggest that it's actual skin. Maybe it's lab-grown human skin. So, you know, there are some fuzzy lines around these definitions.

I would think that cyborgs would be the hardest of all of the, um, humanoid robots to make, because the flesh would just rot. Like, you have your normal-looking humanoid robot, your cyborg, and its ear would just fall off.

Yeah. And it's not as easy as you might think to wire together the wetware that's in our heads with hardware that runs on circuits. We will often think of computers working in a way that's similar to our brains, but in fact the two work in very different ways.
Well, it seems like, running into this, though, Strickland, um, the more we got into humanoid robotics, the more we started to understand just how complex we are.

Yeah, that's one of the things that I think is a benefit of the study of humanoid robotics. The idea of pursuing the goal of creating a humanoid robot is not just that we learn more about all the different areas in robotics, and there are a lot, and we'll talk about some of them, but we also learn more about ourselves. We're trying to figure out, okay, if we're going to make something that is able to perform tasks the way a human does, then we really have to take a close look at humans. That's the first place to start.

So what makes a humanoid robot? Generally speaking, we're talking about a robot that has basic features: usually, at minimum, a torso, arms, and legs, and it walks upright. Uh, it may have a head or it might not. Early humanoid robots didn't, or at least their sensory, uh, instruments were all located within the top part of the torso; there wasn't a separate head.

Did you see a picture of Minerva at the Smithsonian?

No, I did not.

It's a robot tour guide, but, um, she came up in the humanoid robot research, and I think she's stretching it a little bit.

Yeah, she looks a bit like a washing machine with a couple of, um, cameras on top. So just those alone, I guess, make her eligible for the humanoid robot realm. But that seems like a bit of a... you know, if it has an appendage, that doesn't necessarily make it humanoid. I mean, you could look at the Mars Curiosity rover, which has several appendages, but I don't think anyone would ever describe it as humanoid.

Right. So, uh, ideally a humanoid robot would be able to interact with humans within a human environment. Because here's the thing about us people:
We have defined our environments to a large extent, especially in developed nations, where the stuff that's around us we have shaped so that it works within our capabilities, right, within our human environment. Thus far, if you look at technology on the whole, we've pretty much been forced to adapt to it. So, for example, take a keyboard: we don't normally, naturally, you know, um, express ideas through our fingers on a little board, right? We don't normally do that. So we had to adapt to the technology and learn to type and get good at it. With humanoid robots, it's basically going in the exact opposite direction. It's saying, we already have an environment, we already are, um, you know, good at all this other stuff; if we're going to make humanoid robots, one of the great benefits is they can adapt to us.

Right. Yeah, we don't have to, uh, create a unitasker robot that's really good at one thing, um, that may or may not be something that humans can do easily. We can make a robot that's good at lots of things. Uh, I also think... usually when I think of humanoid robots, when I think of robots in general, I normally think that they are at least semi-autonomous. That's one of the things I usually think of. It doesn't necessarily have to be; you could have a teleoperated robot, but I almost think of that as closer to the realm of, like, a remote-controlled car, or a puppet, even. Um, so one of the definitions I often use is that it's an autonomous or semi-autonomous machine in human form. It's mechanical and electronic, and it can thus do the sorts of things humans do, but do them in a totally synthetic way.

And you have to be careful when you say autonomous or semi-autonomous, because the state of the art right now appears to be that robots display autonomy because they just kind of wander off into places they're not supposed to be. But it's not because they want to; it's because their program ran afoul of something. Right.
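To make that concrete, here's a toy sketch in Python; the route table and the coordinates are all invented for the example. The point is that the robot's behavior is nothing but a list of moves, and a single wrong entry sends it somewhere new.

```python
# A robot's "route" is often just a table of moves. One bad entry and the
# robot marches off somewhere new -- apparent autonomy, really just a bug.
# The route and coordinates here are invented for illustration.

MOVES = {"north": (0, 1), "east": (1, 0), "south": (0, -1), "west": (-1, 0)}

def follow_route(route):
    """route: list of (heading, distance) pairs executed blindly, in order."""
    x, y = 0, 0
    for heading, distance in route:
        dx, dy = MOVES[heading]
        x, y = x + dx * distance, y + dy * distance
    return (x, y)

intended = [("north", 3), ("west", 2)]   # take the left turn as planned
buggy    = [("north", 3), ("north", 2)]  # someone mistyped the turn

print(follow_route(intended))  # (-2, 3): where it was supposed to go
print(follow_route(buggy))     # (0, 5): wandered off, no "wanting" involved
```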
There's no determination there on the part of the robot. It's not exploring its environment of its own accord. It's that someone made a mistake in the code somewhere, and where the robot was supposed to take a left-hand turn at this one, you know, predetermined spot, it instead continued forward or something.

Exactly. Um, so I wanted to talk a little bit about the history of humanoid robots. And if you want to look way, way, way back, I mean, we're talking about the first people to really kind of attempt to build a humanoid machine that could mimic the movements, at least, of a person, you've got to go all the way back to Greece, between the years 10 and 70 of the Common Era. That's when Hero of Alexandria started to create various machines. He's also the person who made the first working steam-engine-style tool, which is pretty impressive.

Yeah. He had come up with a lot of very clever designs. Whether they were built or not depends upon, uh, certain accounts, and whether they're true. But the stuff he designed is completely buildable. He didn't come up with any ideas that were so, you know, outlandish that they were impossible.

He wasn't just, like, drawing in the margins of his diary or something.

No. Yeah, he came up with specific plans that people today have recreated; they've built their own versions.

Yeah. So he created a lot of designs for automata, although these were things that were controlled by pulleys and ropes and cogged wheels, and, uh, needed some form of outside influence to make them work, so they're not all fully self-contained.

Like the ShowBiz Pizza Rock-afire Explosion?

Yeah, exactly. Yeah, yeah, it was like one of those audio-animatronic figures that looks very robotic, but you realize it's really just one tiny piece of a giant system.

Uh, in 1495, we get up to Leonardo da Vinci. He designed an automaton in the form of a mechanical knight, which, uh, again, supposedly he built.
There's no actual record of an existing one from history, but they have created ones based on the design since then, and it could do things like move its arms and raise its visor.

Who's doing this? Who's doing this?

Crazy engineers who are also, uh, very excited about history. And very wealthy, too, I would imagine. Yeah, so in this case you're talking about Mark Rosheim, who recreated this particular machine. And it's a knight, a knight in German medieval armor. And it can sit down, it can stand up, it can move its arms, it can raise its visor, it can work its jaw. Menacingly, I imagine. So I tried to find video of Rosheim's version working, and I couldn't find it. He did, however, make another of da Vinci's inventions, which was a self-driving cart. Yeah, you wound a spring, and it had cam stops that would allow it to steer a predetermined path. You would actually program the cart by putting cam stops in particular locations along the cam, and that would tell it when to turn left or when to turn right. So it couldn't navigate through, uh, an obstacle course unless you had already previously seen the obstacle course and could figure out when it needed to turn ahead of time. So you're essentially programming the device.

Um, there are lots of examples in the Renaissance of automata and semi-automata, things that are really more like puppets. You've heard about the Mechanical Turk, the chess-playing robot? Uh, so, it looked like it was a robot that could play chess and was really, really good at playing chess, and it turned out eventually to be a hoax. It was actually a puppet, and there was an actual chess master hidden in a cabinet beneath the Mechanical Turk, who sat kind of cross-legged with a chessboard in front of him and could move the pieces to where they needed to be.
But it was all being guided by an actual chess master who was hidden underneath.

When was that?

That was late Renaissance, early Enlightenment.

That in and of itself is pretty impressive.

Yeah, it was. It was neat that people were thinking about these sorts of things. Uh, by 1927 we get the first humanoid robot to appear on film, in, uh, Metropolis: the character of Maria. And in 1939, at the World's Fair, Westinghouse Electric Corporation showed off a robot called Elektro. Now, have you ever seen this? He kind of looks like the Tin Man from the Wizard of Oz film.

Yes. He smokes. I have seen that.

One of the things he can do: he's got a little bellows in his head that allowed him to puff smoke. He could also kind of speak. He had a seventy-eight-revolution-per-minute, uh, record player, essentially, inside of him.

I suppose if the needle skipped, it would at least sound like he'd repeat himself over and over.

He called the audience "toots." That was the thing. Yeah, apparently the main reason he was retired from being shown off at exhibitions was because it was very dated kind of lingo. But he was used at the nineteen thirty-nine World's Fair, which was... it's funny, because I'll be talking about that again in another episode very shortly. The World's Fair would have been an amazing thing to visit.

Um, I can understand that. But if you're going to get to the point where you're looking at full-scale anthropomorphic robots, you've got to get up to about 1973. That's when the WABOT-1 from Waseda University came out. That was the first full-scale anthropomorphic robot developed in the world. It had limb control; a vision system, so it had an optical system that could recognize its environment and objects and measure distance; and it also had a conversation system. Uh, it was actually a collection of a bunch of very complex machinery. Like, its hands had been previously developed independently of the robot; so had its legs.
So it's like all these people coming up with these various pieces saying, all right, let's connect all this together and see what happens. So that was a huge, huge leap forward.

Yeah. You know, if you'll notice, we went basically from, um, hoaxy chess-playing Turks to, you know, a robot that could converse and interact with its environment. And then, um, it seems like we kind of went off course for a little bit, and now we're coming full circle back to that, where, like you said, a lot of different disciplines are contributing these different pieces to what will eventually be all of the best practices from each little sub-discipline put together in, you know, the true humanoid robot.

Well, yeah. I mean, if we're talking about a humanoid robot that's capable of interacting with people as if it were, you know, its own person, even though maybe an odd person, not like the kind of person you would typically run into, it's a multidisciplinary approach. I mean, artificial intelligence by itself is multidisciplinary, because you have sensing, you have all the perception, there are all these different... just that is multidisciplinary. Then you've got the processing, the cognition, things like planning, navigation. There are so many things that come together to make a humanoid robot a possibility. And that's just the mental side, right? Then you have all the physical side: how do you make it walk, how do you make it keep its balance? So, lots of stuff to consider there. We're going to take a quick break, but when we come back, we will look more at the issue of humanoid robots.

WABOT-2 came out in 1984. That was a specialist robot: it could play a keyboard instrument. It could read sheet music and play music. Uh, but because it was a specialist, it was not able to do the general functions that its predecessor could do. And that's one of the issues in robotics today: it's very challenging to build a general-purpose robot.
It's much easier to take a specific task that you need to have done and design a robot to do that.

Yeah, because, I mean, we already have those: Roombas and rovers. I mean, you know, there are also all the robots in manufacturing, all the welding robots, things like that.

Uh, in 1989, Pacific Northwest National Laboratory built a robot called Manny that was the first full-scale android body. It had forty-two degrees of freedom, but no AI or autonomy; it was completely teleoperated. And, um, it took me a couple of tries to figure out what degrees of freedom meant. I thought it meant, like, it could move its arm forty-two degrees, basically. But a degree of freedom is, say, like, it can move its wrist, that's a degree of freedom; it can turn its head left and right, that's a degree of freedom.

Right, right. And if you look at the human hand, the human hand has about thirty degrees of freedom, meaning you look at the way each finger and your thumb can move, you look at the way you can clench a fist, you can twist your hand with your wrist, um, those are all different degrees of freedom. And, uh, in fact, one of the cool things about robots is that as we get better and better at designing them, we can create robots that have far more degrees of freedom than the human body does.

So I like the idea of a humanoid robot in the future that has 360-degree motion with its wrist, and then just having it change light bulbs. It just spins, and doesn't have to do the little twisty turning motion.

How many robots does it take to change the light bulbs?

Just the one. Just that one. Just one billion-dollar robot.

Yeah. Yeah, I'm not saying it's an efficient system. I'm just saying I'm a supremely lazy human being with tall ceilings.

Uh, in 1996, Honda introduced the P2 robot, which was a self-contained robotic humanoid. It could walk and climb stairs.
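For what it's worth, a figure like Manny's forty-two degrees of freedom falls out naturally if you model a robot as a list of joints and count each independently controllable axis once. A minimal sketch, with joint names invented for a hypothetical arm:

```python
# Each joint contributes one degree of freedom per independently
# controllable axis. These joint names are invented for illustration.
arm = {
    "shoulder": 3,  # pitch, roll, yaw
    "elbow":    1,  # bend
    "wrist":    2,  # bend, rotate
}
# A toy hand: five fingers, three single-axis joints each.
hand = {f"finger_{i}_joint_{j}": 1 for i in range(5) for j in range(3)}

dof = sum(arm.values()) + sum(hand.values())
print(f"total degrees of freedom: {dof}")  # 6 + 15 = 21 for this toy arm
```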
The P3 followed in 1997, and in 2000, Honda introduced my good buddy ASIMO.

ASIMO was the first article I wrote for HowStuffWorks.

Did you ever do an episode on it?

I have not done a full episode on ASIMO. I was even offered the opportunity to meet ASIMO when I first wrote the article, but it would have meant having to travel to Disneyland to do it, and at the time HowStuffWorks was not prepared to do such a thing. I went to Disneyland by myself, uh, well, with my wife, and we went and saw the ASIMO show. And at the end of it, I talked to one of the Disney cast members, and I said, yeah, I wrote the article about how ASIMO works for HowStuffWorks. And she said, hang on a minute. And I got to meet ASIMO.

And it was a man in a suit, right?

It was actually... it was actually a collection of cats that they duct-taped together and then covered in plastic. No, it was a working robot.

That's pretty neat. How blown away were you?

I was very much blown away. It was cool seeing it up close. I mean, it looks like a little tiny astronaut, right? Because he's got, like, the faceplate especially. I love that people call it "he." People do, and I do this all the time too: with robots, I'll assign a gender, even though technically many of them are specifically genderless. ASIMO is supposed to be genderless, but I often refer to ASIMO as a "he" as well.

Well, you know why? It's the shoulders, I would guess. Yeah, they go straight across, and far out. That's very masculine no matter what.

Yeah, and you need that, I would guess. You need shoulders in a humanoid robot with flexible arms.

Well, and also, I'm sure every single element of ASIMO is built with balance in mind, because ASIMO is the first robot that can run.

Yes, I've seen him run. It's gawky. Yeah, it kind of looks like someone who really needs to get to the bathroom. It's a little bit of a hoppy kind of run.
But the definition of "run" here is that there are moments where both feet are off the ground. So, walking, you always have one foot in contact with the ground; running, both feet at some point are out of contact. And that's a huge deal for robotics, right? I mean, you have a machine that completely separates itself from contact with the ground. It has no propulsion to keep it upright, you know; it doesn't have, like, propellers or jets or anything like that. So you have to design it so it can propel itself off the ground and then catch itself when it comes back down without falling over. And that is a non-trivial challenge.

No, it's an enormous challenge that robotics, um, really only started to tackle lately. Um, one of the ways that they've overcome it is with rounded feet, which are very helpful in keeping balance and allowing it to run. Um, but there are drawbacks as well: like, the robot can't start itself, and it also can't stop, so it can't stop moving, which is not something you want. Like, there are still some challenges there ahead of the roboticists who are learning to teach a robot to walk. And even the ones that have taught robots to walk, um, those robots typically can just walk over flat surfaces with no obstacles. When they encounter stairs, they're in trouble. But then you have robots that know how to go up stairs, but they can't walk on a flat surface. Eventually, all this information, all this knowledge, will be brought together, and you'll have a robot that can walk, no problem.

Right. In fact, this kind of transitions nicely into those challenges that face designers of humanoid robots. And locomotion is probably one of the top ones, at least from the physical engineering side. For example, you know, ASIMO can go up and down stairs, but that is a little deceptive, because ASIMO has to be programmed to go up or down the staircase and know exactly how many stairs are involved.
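As a rough caricature of what "initiating a stair-climbing program" means in practice (this is an invented sketch, not Honda's actual control code): the routine only works because the stair count and dimensions are handed to it up front, rather than detected.

```python
# A caricature of a pre-programmed stair routine: the robot doesn't
# perceive the staircase at all; the operator supplies the parameters.
# Nothing here is Honda's real code -- it's purely illustrative.

def climb_stairs(num_steps, step_height_cm, step_depth_cm):
    for step in range(1, num_steps + 1):
        # Each phase would be a carefully tuned whole-body motion.
        print(f"step {step}: shift weight, lift foot {step_height_cm} cm, "
              f"reach {step_depth_cm} cm forward, plant, rebalance")
    print("routine complete")

# Works only if the staircase matches the numbers exactly. One extra
# stair, or one slightly taller step, and the robot is in trouble.
climb_stairs(num_steps=12, step_height_cm=17, step_depth_cm=28)
```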
It's not a case of ASIMO detecting a staircase and then, uh, navigating up or down it. It's the fact that, all right, now we're initiating your stair-climbing program.

Yeah, exactly. It's kind of like smoke-and-mirrors robotics, basically.

It is. But, you know, those are the little, no pun intended, those are the little steps you have to take in order to get to the destination.

What do you mean, no pun intended? I don't buy that at all.

I started saying it without thinking about it, and then, I mean, but then I did follow through with it, so I guess there was some intention there at the end. But, uh, yeah. They're also not very good at going across any kind of uneven terrain, right?

Right. So, humanoid robots in particular find it very difficult to maintain balance over anything that's not either a flat surface or, in the case of some robots that can go up or down stairs, stairs. So if you're talking about, like, a sidewalk that is not completely even, that would be enough to give a robot trouble, because it's going to try and put its foot down where it believes the ground to be, and if the ground is not exactly level, then that's... yeah, because they can't really catch their balance very well.

There are robots that can, but they are four-legged. Yeah, have you seen the BigDog video?

No, I saw the Army one, though.

It's very similar. BigDog is essentially a robo-mule type of development. It is a four-legged robot that is able to maintain its balance even when pushed. And the famous video shows the robot dog, the BigDog, kind of jogging, and then a guy just casually lifts his leg up and kicks the robot dog. Like, he essentially puts the ball of his foot against the side of it and pushes really hard, tipping it, and you see the BigDog stumble. It actually stumbles, and then catches itself, and then rights itself and continues on.
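That stumble-and-recover behavior can be thought of as a feedback loop: measure how far the body has tilted, push back proportionally, and give up past some unrecoverable angle. A deliberately simplified sketch of the idea; real systems like BigDog use far more sophisticated control, and every constant below is invented.

```python
# A toy proportional controller for balance recovery: sense the tilt,
# apply a counter-torque proportional to it. Real legged robots do vastly
# more than this; the gain and thresholds here are arbitrary.

def recover(tilt_deg, gain=0.6, fall_limit_deg=45.0, steps=20):
    for t in range(steps):
        if abs(tilt_deg) >= fall_limit_deg:
            return f"fell over at step {t} (tilt {tilt_deg:.1f} deg)"
        if abs(tilt_deg) < 0.5:
            return f"recovered after {t} steps"
        tilt_deg -= gain * tilt_deg  # counter-torque pushes tilt toward zero
    return f"still wobbling (tilt {tilt_deg:.1f} deg)"

print(recover(tilt_deg=20.0))  # a solid kick, but within recovery range
print(recover(tilt_deg=50.0))  # past the limit: no controller saves this
```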
And almost everyone has an emotional reaction to this, like, how dare that evil man kick that poor, defenseless robot. The robot can't feel anything. But that robot is, um, gasoline-powered.

Oh yeah, that's right. That's one of the big things keeping it from coming indoors.

Yeah, you wouldn't want to have one of these indoors.

No, you don't bring your lawnmower indoors.

No. And, uh, the pistons that allow it to do this are quite loud, you know; it's not a subtle system at all. So a lot of work has to go into creating better systems for robots to maintain their balance in order for the locomotion problem to really be solved. And again, I mean, anybody who's seen Short Circuit knows that you can build a robot like Johnny Five, with arms and a head and a torso, and then, like, traction, um, treads, uh, and it can go anywhere over terrain. It can go up steps, probably. The thing is, again, you have to remember, when it comes to humanoid robots, you're trying to make the robot that can adapt to the human world. So if you had somebody like Johnny Five as your house butler or something, you couldn't... you couldn't have an island in your kitchen. And who doesn't love an island in their kitchen? Johnny Five couldn't maneuver around it, because he's too wide.

Yeah, you wouldn't have... you wouldn't be able to have any space that would be narrower than the robot's body, exactly.

That's not what you want. Yeah. Well, it wouldn't work well in my house. I've got a, I've got a flat-style house, where there are three floors. Yeah, it's a flat, like a European flat. Uh, not flat as in there's only one level; there are actually three of them, four if you count the rooftop deck, so that counts.
It makes it so that, you know, any robot that would not be able to navigate stairs easily would definitely have an issue, which is the main reason why I don't have a Roomba: I don't want to hear the sound of a Roomba falling down a flight of stairs. Um, but at any rate... uh, that's a great point. Moving on from locomotion, there's also dexterous manipulation.

Yeah, I think we should... I think that point bears repeating. What we just talked about is locomotion. Yeah. And this is you and I, a couple of non-robot-experts, talking about the problem of locomotion. That's just one of myriad challenges facing humanoid robotics designers.

Yeah, yeah, exactly. It's one that's easy to point to, because it's something that we all, you know, end up at least observing or participating in all the time. We kind of take it for granted. But then when you think, okay, well, how do I make a machine that does that, you start to realize... you know, even if I have a leg that has lots of different degrees of freedom and points of articulation, I still have to design the upper part of the robot so that it does not unbalance the lower part, and if it does unbalance, it's able to catch itself. You know, some people just describe walking as falling and catching yourself over and over again.

Yeah, yeah. They're walking right now.

No, I describe walking as something that other people do. I like to keep my walking to a minimum.

I thought you walked a lot.

Actually, I do. I just joke about being lazy. I think moving forward, falling down, and catching your balance every time, that's lurching.

Yeah, well, as an Addams Family fan, I'm okay with that.

Yeah. But, uh, dexterous manipulation would be the ability to pick up and manipulate objects. Now, we're really good at that, we humans. You know, we can feel an object and decide at that point how to handle it, even if we've never encountered that kind of object before.
So if I encounter something I've never seen before, and I've ascertained it's safe for me to touch, I can touch it. I can get a feel for how heavy it is, I can get a feel for how delicate it might be, and then I can adjust on the fly so that I can handle it appropriately, where I'm not going to hurt myself and I'm not going to hurt the object. Robots are not so good at that.

Yeah, exactly. Even if they don't mean to, right?

Yeah. If the robot's grip is too strong, it can break the object. If it's too weak, the object slips from its grip and falls. Uh, and it may not be able to distinguish between different types. So getting those tactile sensors, where a robot can tell how tightly it's gripping something and how much pressure a particular object can take before you've reached the failure point, is a big deal. Now, this is also a big deal for just making robots safe for humans to be around.

It is a big deal. You know, the first fatality by robot occurred, um, at the business end of a robotic arm in Flat Rock, Michigan, in 1979: a man named Robert Williams, who was working on a Ford line. Yeah, a robot arm was moving a little slow for his taste in getting supplies down, so he climbed up to where the supplies were. The robot arm suddenly sped up, hit him in the head, and killed him instantly.

Wow. Yeah, I've had a chance to see some of these industrial robots. Uh, and I would say up close, but you can't, because of instances like that. Industrial robots usually have lots of safety barriers around them, because it's not safe to be near those robots when they're in operation. They can't react, exactly. So you leave it up to the humans to stay away from the robots, because we haven't gotten to the point where the robot knows not to crush you or hit you in the head. Right. Yeah.
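The tactile problem boils down to a narrow window: squeeze hard enough that the object doesn't slip, but stay below the force that breaks it. A hedged sketch of that control loop, with invented numbers standing in for real sensor readings:

```python
# Toy grip controller: ramp force up until the slip sensor stops firing,
# but never exceed the object's estimated failure force. The sensor and
# the numbers are stand-ins; real tactile control is much richer.

def grip(slip_threshold_n, failure_force_n, step_n=0.5):
    force = 0.0
    while True:
        slipping = force < slip_threshold_n  # pretend slip sensor
        if not slipping:
            return f"stable grip at {force:.1f} N"
        force += step_n
        if force >= failure_force_n:
            return f"object crushed at {force:.1f} N"

print(grip(slip_threshold_n=3.0, failure_force_n=40.0))  # a mug: fine
print(grip(slip_threshold_n=6.0, failure_force_n=4.0))   # an egg: crushed
```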
I'm looking forward to that day, when they figure out not to crush me.

Yes. It's been pretty lousy so far. Uh, yeah. When I toured the Georgia Tech robotics lab, they talked specifically about this. This is a real challenge: having robots recognize and react in a way that's going to be safe around humans. And, uh, but that's only a part of dexterous manipulation, obviously. The rest of it is, again, that object recognition and handling, so that you're not destroying whatever it is you're trying to pick up.

Um, another big challenge in designing robots in general, not just humanoid robots, is just the perception, the sensory perception of the robot.

Yeah. So, you know, whether it's optical systems, like actual cameras in the place of eyes, or infrared, so that you can see even in low-light situations, radar, lidar... I mean, there are tons of different ways of sensing.

Yep. There's, uh, there's, you know, sensing, obviously; it's not just sight. Then you have to have the sound. That's a really tricky one, actually, because we humans can kind of zero in on what's important, right? So if you and I were at a party, which, you know, someone made a mistake and invited me, uh, we could have a conversation and be able to carry that conversation on even within the context of a big, bustling party, because we can focus on what the other person is saying.

It's called latent inhibition. Yeah. And when you don't have that, that's schizophrenia. Yeah, you can't separate the signal from the noise, and everything either becomes noise or everything becomes signal.

Um, so for a robot that might, for example, require verbal commands, that's really tricky. What if you have the television on, and someone's saying something on TV, and, uh, you are trying to get your robot to do something, and it's not quite sure what to do, because it's hearing these different commands and isn't sure who to obey?
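One blunt way to handle the "who to obey" problem is to refuse to act unless exactly one heard utterance matches a known command list. This is a crude sketch with a made-up command set; real systems lean on wake words and speaker identification instead.

```python
# Crude disambiguation: collect everything heard in a window and act only
# if exactly one utterance parses as a known command. The command list is
# invented; real assistants use wake words and speaker ID instead.

KNOWN_COMMANDS = {"bring the mug", "start the dishwasher", "stop"}

def decide(heard_utterances):
    matches = [u for u in heard_utterances if u in KNOWN_COMMANDS]
    if len(matches) == 1:
        return f"executing: {matches[0]}"
    if not matches:
        return "no command heard; doing nothing"
    return f"ambiguous ({len(matches)} commands heard); asking for clarification"

# You, plus a TV shouting in the background:
print(decide(["bring the mug", "tonight on CSI Miami", "kill Billy"]))
print(decide(["stop", "bring the mug"]))  # two valid commands: don't guess
```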
Yeah, 595 00:33:38,640 --> 00:33:42,200 Speaker 1: you or the person pitching the Bacon Bowl. Right, right, exactly. 596 00:33:42,360 --> 00:33:44,040 Speaker 1: Let's say that you are, you know, trying to get 597 00:33:44,040 --> 00:33:47,040 Speaker 1: some help in the kitchen, but it just keeps hearing, uh, 598 00:33:47,240 --> 00:33:49,520 Speaker 1: CSI: Miami and someone saying kill Billy. And then 599 00:33:49,600 --> 00:33:51,960 Speaker 1: next thing you know, you're just desperately trying 600 00:33:51,960 --> 00:33:54,960 Speaker 1: to tell the robot, please don't kill Billy. Um, yeah, 601 00:33:56,040 --> 00:33:59,120 Speaker 1: while that's a silly example, it's a real problem. Uh, 602 00:33:59,160 --> 00:34:02,960 Speaker 1: and then there's the tactile response, the tactile sensors, 603 00:34:03,000 --> 00:34:05,800 Speaker 1: like making sure you don't crush something that's delicate; 604 00:34:05,840 --> 00:34:10,200 Speaker 1: that falls into perception. Smell can also fall into perception. 605 00:34:10,719 --> 00:34:13,279 Speaker 1: You might want to have humanoid robots that work in 606 00:34:13,360 --> 00:34:17,480 Speaker 1: areas where the humanoid robot can alert humans to the 607 00:34:17,480 --> 00:34:20,160 Speaker 1: presence of things that might be toxic. You know, this 608 00:34:20,239 --> 00:34:23,840 Speaker 1: isn't necessarily just the robot butler we're talking about. This 609 00:34:23,920 --> 00:34:27,319 Speaker 1: could be robots that work in areas that might be 610 00:34:27,719 --> 00:34:31,239 Speaker 1: dangerous for humans, and that would be an important element too. 611 00:34:31,719 --> 00:34:34,080 Speaker 1: It's not a robot, but NASA already has a sensor 612 00:34:34,320 --> 00:34:38,040 Speaker 1: that senses things like ammonia or smoke. It can actually 613 00:34:38,280 --> 00:34:45,799 Speaker 1: artificially smell smoke before the fire has actually ignited. Interesting. Yeah, 614 00:34:45,880 --> 00:34:48,839 Speaker 1: because, you know, fire is such a dangerous proposition, right? Yes, 615 00:34:48,920 --> 00:34:52,759 Speaker 1: clearly, for anything NASA related. But they can also 616 00:34:52,840 --> 00:34:55,160 Speaker 1: sense ammonia, because, you know, a lot of the refrigeration 617 00:34:55,200 --> 00:34:59,000 Speaker 1: systems run on ammonia, and you can't have an ammonia leak 618 00:34:59,239 --> 00:35:02,680 Speaker 1: on the space station, right. Right. And I mean the same 619 00:35:02,680 --> 00:35:04,839 Speaker 1: thing is true for, I mean, I've heard of 620 00:35:05,520 --> 00:35:09,880 Speaker 1: robots that are used in mining operations, where, you know, 621 00:35:09,920 --> 00:35:12,080 Speaker 1: if you come upon a pocket of natural gas, that 622 00:35:12,120 --> 00:35:14,640 Speaker 1: can be a real danger, that sort of stuff. Then 623 00:35:14,680 --> 00:35:18,440 Speaker 1: you've got the back end of the sensory perception. 624 00:35:18,680 --> 00:35:22,360 Speaker 1: That's where you have the actual interpretation of the data. 625 00:35:22,840 --> 00:35:25,120 Speaker 1: And that's a big one.
It's huge, because not only 626 00:35:25,120 --> 00:35:27,200 Speaker 1: do you need to have a robot 627 00:35:27,280 --> 00:35:31,720 Speaker 1: that has binocular vision, so it has depth of field, right, um, 628 00:35:31,760 --> 00:35:35,480 Speaker 1: it also has to know what 629 00:35:35,520 --> 00:35:40,080 Speaker 1: the information means and how to apply it to adapt to changes. 630 00:35:40,360 --> 00:35:43,680 Speaker 1: Right, right. So if I were to show you, 631 00:35:43,840 --> 00:35:46,960 Speaker 1: Josh, a series of pictures of various types of dogs, 632 00:35:47,520 --> 00:35:50,400 Speaker 1: you would very quickly pick up on the things 633 00:35:50,480 --> 00:35:53,880 Speaker 1: that mean dog. Like, you would understand the concept 634 00:35:53,920 --> 00:35:59,120 Speaker 1: of dog pretty quickly. Robots and various other computing machines 635 00:35:59,600 --> 00:36:02,400 Speaker 1: have a much harder problem with this. If whatever 636 00:36:02,480 --> 00:36:06,239 Speaker 1: they're looking at doesn't exactly match the parameters of the example, 637 00:36:06,880 --> 00:36:10,520 Speaker 1: it's very difficult for a machine to extrapolate and say, oh, 638 00:36:10,640 --> 00:36:13,359 Speaker 1: this other thing I'm looking at relates to this thing 639 00:36:13,440 --> 00:36:17,719 Speaker 1: I know, even though the two examples aren't identical. 640 00:36:18,360 --> 00:36:20,400 Speaker 1: So the same thing could be true for any object. 641 00:36:20,520 --> 00:36:23,560 Speaker 1: Let's just use a coffee mug, and let's say 642 00:36:23,640 --> 00:36:27,560 Speaker 1: that you use a plain white coffee mug of average 643 00:36:27,640 --> 00:36:32,200 Speaker 1: size as the example for the robot, and then the 644 00:36:32,280 --> 00:36:36,440 Speaker 1: robot encounters a larger blue coffee mug with the handle 645 00:36:36,480 --> 00:36:39,200 Speaker 1: turned the other way. The robot might be completely 646 00:36:39,200 --> 00:36:42,600 Speaker 1: befuddled by this. So this is a real problem in 647 00:36:42,760 --> 00:36:47,960 Speaker 1: artificial intelligence: object identification, so that a robot knows 648 00:36:48,000 --> 00:36:51,480 Speaker 1: what it's looking at and also understands the role that 649 00:36:51,480 --> 00:36:56,239 Speaker 1: that object fills within the environment. So it's not just, oh, 650 00:36:56,280 --> 00:36:59,280 Speaker 1: that's a mug; it's, oh, that's a mug, a mug 651 00:36:59,320 --> 00:37:01,640 Speaker 1: is a container, I can put things into that mug, 652 00:37:01,800 --> 00:37:03,279 Speaker 1: here are the things that can go in the mug, 653 00:37:03,320 --> 00:37:05,319 Speaker 1: here are the things that absolutely should not go in 654 00:37:05,320 --> 00:37:07,840 Speaker 1: the mug, like Billy. Those are the kinds of things 655 00:37:07,880 --> 00:37:11,160 Speaker 1: that, yeah, Billy, well, you know, we're gonna need another Billy. Um, 656 00:37:11,640 --> 00:37:14,239 Speaker 1: we'll make a robot Billy. Yeah, in which case 657 00:37:14,280 --> 00:37:16,560 Speaker 1: you can just turn those suckers out, right? Mass production 658 00:37:16,600 --> 00:37:20,520 Speaker 1: of Billy. But yeah, artificial intelligence is an enormous problem. And that, 659 00:37:20,600 --> 00:37:22,840 Speaker 1: of course, is not just with robotics.
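(To make the mug example concrete: an exact-match lookup fails as soon as the mug changes, while even a crude feature-based comparison still generalizes. The feature vectors below are made up purely for illustration; a real vision system works on far richer data.)

from math import dist

# Made-up feature vectors: (height_cm, width_cm, has_handle, is_container)
known = {
    "mug":   (10.0, 8.0, 1.0, 1.0),    # the plain white training mug
    "plate": (2.0, 25.0, 0.0, 0.0),
}

# A larger blue mug with the handle turned the other way: not an
# exact match for anything the robot was "programmed" with.
new_object = (13.0, 10.0, 1.0, 1.0)

# Exact matching is completely befuddled:
print(new_object in known.values())                          # False

# Nearest neighbor over the features still lands on "mug":
print(min(known, key=lambda k: dist(known[k], new_object)))  # mug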
That's a field unto itself, 660 00:37:22,920 --> 00:37:27,200 Speaker 1: and robotics is just one branch that 661 00:37:27,600 --> 00:37:31,920 Speaker 1: relies upon artificial intelligence. And from what I came across, 662 00:37:31,960 --> 00:37:34,120 Speaker 1: it looked like just out of the gate, I guess 663 00:37:34,120 --> 00:37:37,440 Speaker 1: this was at Waseda University, they tried to build a 664 00:37:37,560 --> 00:37:41,600 Speaker 1: robot that was just, like, high functioning, yeah, and they realized, like, 665 00:37:41,800 --> 00:37:44,960 Speaker 1: we have no idea what we're doing. Yeah, that 666 00:37:45,000 --> 00:37:48,840 Speaker 1: WABOT-1 was able to converse at the level of, or 667 00:37:49,000 --> 00:37:53,520 Speaker 1: was able to have, a cognitive function equivalent 668 00:37:53,560 --> 00:37:56,320 Speaker 1: to a one-and-a-half-year-old person, which 669 00:37:56,560 --> 00:37:59,160 Speaker 1: I have to say, when you're talking right out of 670 00:37:59,160 --> 00:38:02,879 Speaker 1: the gate, yeah, impressive, very impressive, because we're not that 671 00:38:02,960 --> 00:38:06,840 Speaker 1: much further along now. But what they found from making 672 00:38:06,840 --> 00:38:10,960 Speaker 1: WABOT-1 was, okay, this is way more difficult than 673 00:38:10,960 --> 00:38:15,360 Speaker 1: we thought. You can't just program every kind of coffee 674 00:38:15,360 --> 00:38:18,480 Speaker 1: cup in the world, and even if you could, then 675 00:38:18,520 --> 00:38:21,080 Speaker 1: you also have to program every kind of table and 676 00:38:21,080 --> 00:38:23,680 Speaker 1: every kind of light. We need to come at 677 00:38:23,680 --> 00:38:27,319 Speaker 1: this in a different way. And so they realized, number one, 678 00:38:27,560 --> 00:38:32,480 Speaker 1: humans are extraordinarily more complex than we thought before. And 679 00:38:32,520 --> 00:38:35,640 Speaker 1: then, number two, humans make a pretty good model for 680 00:38:35,719 --> 00:38:39,480 Speaker 1: a humanoid robot in the realm of things like perception 681 00:38:40,000 --> 00:38:45,520 Speaker 1: and, um, information systems and, uh, learning. So they went 682 00:38:45,640 --> 00:38:52,759 Speaker 1: to these different disciplines like neurobiology, neurology, um, psychology, 683 00:38:52,800 --> 00:38:55,279 Speaker 1: and they said, what can we learn from you guys 684 00:38:55,320 --> 00:38:58,440 Speaker 1: about how humans do this that we can apply to robots? 685 00:38:58,600 --> 00:39:03,759 Speaker 1: And since they started taking those steps, it seems like, uh, 686 00:39:04,160 --> 00:39:07,920 Speaker 1: humanoid robotics has gotten its footing a little more. Yeah, 687 00:39:08,000 --> 00:39:10,719 Speaker 1: and we're seeing so many developments in other areas of 688 00:39:10,760 --> 00:39:14,120 Speaker 1: AI that are really promising. I always bring up IBM's 689 00:39:14,160 --> 00:39:18,880 Speaker 1: Watson because its natural language recognition was phenomenal. The 690 00:39:18,920 --> 00:39:23,200 Speaker 1: ability for it to parse clues in Jeopardy and come 691 00:39:23,280 --> 00:39:26,600 Speaker 1: up with the appropriate answer, knowing that on Jeopardy those clues 692 00:39:26,640 --> 00:39:30,200 Speaker 1: are not always straightforward. Uh.
And it again illustrates the 693 00:39:30,239 --> 00:39:34,759 Speaker 1: complexity that we humans navigate without much trouble, because this 694 00:39:34,840 --> 00:39:38,080 Speaker 1: is the world we've created. But then we realize, if 695 00:39:38,120 --> 00:39:40,480 Speaker 1: we make a machine that's, mostly, when you get down 696 00:39:40,480 --> 00:39:43,480 Speaker 1: to it, based on yes or no, a one or 697 00:39:43,520 --> 00:39:46,360 Speaker 1: a zero, true or false, and you're trying to build 698 00:39:46,400 --> 00:39:49,719 Speaker 1: complex behaviors off of something that is incredibly simple when 699 00:39:49,719 --> 00:39:52,719 Speaker 1: you boil it down to its basic element, that's where 700 00:39:52,760 --> 00:39:54,800 Speaker 1: you're like, oh, this is gonna require a 701 00:39:54,840 --> 00:39:57,680 Speaker 1: lot of work. I mean, IBM's Watson was an enormous 702 00:39:57,719 --> 00:40:02,799 Speaker 1: machine with thousands of microprocessors just so it could 703 00:40:02,800 --> 00:40:07,080 Speaker 1: be able to play Jeopardy. And that's a very specific function, too. 704 00:40:07,520 --> 00:40:18,440 Speaker 1: We'll be right back after this next break. So, creating 705 00:40:18,480 --> 00:40:22,279 Speaker 1: a robot that is able to navigate and interact with 706 00:40:22,320 --> 00:40:24,800 Speaker 1: a human environment and be able to interact with humans 707 00:40:24,840 --> 00:40:27,600 Speaker 1: in a way that makes sense is a big challenge. Also, 708 00:40:27,680 --> 00:40:31,120 Speaker 1: just the way that a robot would socialize with humans 709 00:40:31,280 --> 00:40:33,480 Speaker 1: is a huge challenge. How do you make 710 00:40:33,480 --> 00:40:38,000 Speaker 1: a robot that is able to respond to commands and 711 00:40:38,200 --> 00:40:41,400 Speaker 1: cues in an appropriate way? An appropriate way is 712 00:40:41,440 --> 00:40:44,360 Speaker 1: the key there, because humans are pretty complex 713 00:40:44,360 --> 00:40:46,839 Speaker 1: and we can be very subtle in many ways. Yeah, 714 00:40:47,000 --> 00:40:52,399 Speaker 1: we don't speak plainly, we use sarcasm, um, and, uh, yeah, 715 00:40:52,480 --> 00:40:57,080 Speaker 1: we use a lot of gestures rather than just words. Yep. 716 00:40:57,280 --> 00:40:59,759 Speaker 1: There's a lot that goes into human communication that, if 717 00:40:59,800 --> 00:41:03,080 Speaker 1: you are a human, is pretty much natural, especially, I 718 00:41:03,080 --> 00:41:05,680 Speaker 1: mean, if you're a human within that particular culture and 719 00:41:05,719 --> 00:41:08,600 Speaker 1: you're familiar with that culture. Because anyone who has traveled 720 00:41:08,600 --> 00:41:11,839 Speaker 1: extensively knows there are cultures where things that would be 721 00:41:12,000 --> 00:41:15,600 Speaker 1: commonplace at home are very different in the place where 722 00:41:15,640 --> 00:41:17,640 Speaker 1: you happen to be, right, and it may be 723 00:41:17,800 --> 00:41:22,240 Speaker 1: that something that is completely innocent at home is an, uh, 724 00:41:22,280 --> 00:41:25,200 Speaker 1: offensive gesture in the place where you are. Now, 725 00:41:25,239 --> 00:41:29,520 Speaker 1: imagine a robot that is not programmed to handle these 726 00:41:29,560 --> 00:41:38,120 Speaker 1: kinds of subtle, uh, communication methods, and yeah, he doesn't know, 727 00:41:38,480 --> 00:41:41,560 Speaker 1: he's just doing as he was programmed.
But 728 00:41:41,680 --> 00:41:44,319 Speaker 1: even beyond that, something that you brought up in 729 00:41:44,360 --> 00:41:48,120 Speaker 1: our research when we were planning this was the uncanny valley. Yeah, 730 00:41:48,160 --> 00:41:50,680 Speaker 1: that's a big one. Yeah, I'd never read that 731 00:41:50,680 --> 00:41:54,000 Speaker 1: paper before, and I'm glad I did. It's really interesting. Right? Yeah. 732 00:41:54,040 --> 00:41:56,240 Speaker 1: So the uncanny valley, for those who are not familiar 733 00:41:56,239 --> 00:41:59,359 Speaker 1: with the term, uh, it describes when we start 734 00:41:59,440 --> 00:42:04,680 Speaker 1: to approach artificial humans that look almost, but not 735 00:42:04,800 --> 00:42:07,759 Speaker 1: quite, like real humans. And by look, I don't 736 00:42:07,800 --> 00:42:11,160 Speaker 1: necessarily just mean the physical appearance. I also mean their behaviors, 737 00:42:11,160 --> 00:42:14,239 Speaker 1: their movements. So if you were to 738 00:42:14,360 --> 00:42:18,279 Speaker 1: look out and see a figure that from 739 00:42:18,280 --> 00:42:20,560 Speaker 1: a distance looked like it was a human figure, and 740 00:42:20,600 --> 00:42:22,880 Speaker 1: you start walking towards it just thinking this is another person, 741 00:42:23,040 --> 00:42:25,400 Speaker 1: and then it starts moving in a very herky-jerky motion, 742 00:42:25,719 --> 00:42:30,200 Speaker 1: a very mechanical motion, then you're likely going to have a 743 00:42:30,280 --> 00:42:34,799 Speaker 1: negative emotional response. Um, often revulsion is one of the 744 00:42:34,800 --> 00:42:39,560 Speaker 1: words used. Yeah, I mean, I remember the CGI 745 00:42:39,719 --> 00:42:43,920 Speaker 1: movies that were almost to the 746 00:42:44,000 --> 00:42:47,840 Speaker 1: point of photorealism, where they look like people for 747 00:42:47,880 --> 00:42:50,399 Speaker 1: the most part, but something's just not quite 748 00:42:50,440 --> 00:42:56,800 Speaker 1: right with the eyes. There's a good example in The Polar 749 00:42:56,800 --> 00:43:00,120 Speaker 1: Express. The Polar Express weirded everyone out, and the 750 00:43:00,239 --> 00:43:02,359 Speaker 1: uncanny valley, that's what they blame it on. Yeah, and 751 00:43:02,440 --> 00:43:05,120 Speaker 1: the same thing applies to robots. So, in fact, I 752 00:43:05,160 --> 00:43:08,840 Speaker 1: saw a robot that was really disturbing to me. It 753 00:43:08,880 --> 00:43:13,719 Speaker 1: was in an art exhibit, an installation, and, um, it 754 00:43:13,800 --> 00:43:18,239 Speaker 1: was a robot of Confucius in a cell filled with monkeys, 755 00:43:19,080 --> 00:43:22,720 Speaker 1: and, uh, they were live monkeys, real monkeys, and the robot 756 00:43:22,800 --> 00:43:28,239 Speaker 1: would just thrash around wildly. It was the 757 00:43:28,239 --> 00:43:30,920 Speaker 1: stuff of nightmares. I'll show it to you after 758 00:43:30,960 --> 00:43:35,000 Speaker 1: the show. Yeah. So these are all big challenges, and 759 00:43:35,040 --> 00:43:37,440 Speaker 1: some of them are going to be harder for us 760 00:43:37,440 --> 00:43:39,920 Speaker 1: than others. It may be that the engineering challenges of 761 00:43:40,000 --> 00:43:43,879 Speaker 1: locomotion are solved well before we ever get a real 762 00:43:43,920 --> 00:43:46,520 Speaker 1: grip on all the artificial intelligence problems.
Or it could 763 00:43:46,520 --> 00:43:50,120 Speaker 1: be the other way around. Uh, but it is multidisciplinary. 764 00:43:50,320 --> 00:43:53,720 Speaker 1: It's a big, big issue. So there are some people 765 00:43:53,719 --> 00:43:57,400 Speaker 1: who argue for humanoid robots, there are people who argue 766 00:43:57,440 --> 00:44:02,200 Speaker 1: against humanoid robots. Where do you stand? Um, I 767 00:44:02,200 --> 00:44:06,240 Speaker 1: think research and development with humanoid robots is important, because 768 00:44:06,280 --> 00:44:09,680 Speaker 1: by having the goal of creating a humanoid robot, you 769 00:44:09,800 --> 00:44:13,320 Speaker 1: drive the research and development process. You have a specific 770 00:44:13,320 --> 00:44:15,640 Speaker 1: goal in mind, and in order to achieve that goal, 771 00:44:15,880 --> 00:44:18,200 Speaker 1: you know what sort of problems you have to solve. 772 00:44:18,760 --> 00:44:21,880 Speaker 1: And even if we never enter a future where humanoid 773 00:44:21,960 --> 00:44:27,080 Speaker 1: robots are a common thing, even if they are mostly 774 00:44:27,200 --> 00:44:31,920 Speaker 1: used as something in an exhibition or, uh, for PR 775 00:44:32,040 --> 00:44:34,439 Speaker 1: or whatever, even if that's the only use for them, 776 00:44:34,920 --> 00:44:38,359 Speaker 1: we're going to benefit from the research and development of 777 00:44:38,440 --> 00:44:42,399 Speaker 1: making that possible in ways we can't anticipate. Well, yeah, 778 00:44:42,400 --> 00:44:45,560 Speaker 1: and then the more we get into humanoid robotics, the 779 00:44:45,600 --> 00:44:50,000 Speaker 1: more we understand humans, which is pretty much the only 780 00:44:50,360 --> 00:44:54,120 Speaker 1: argument I've seen that stands up in favor of doing 781 00:44:54,200 --> 00:44:57,640 Speaker 1: humanoid robots. Yeah, because it's expensive and it's hard. 782 00:44:57,960 --> 00:45:00,719 Speaker 1: I mean, it's really a difficult problem, and 783 00:45:00,760 --> 00:45:03,239 Speaker 1: to build a humanoid robot, something that 784 00:45:03,360 --> 00:45:07,960 Speaker 1: is capable of being a general-purpose robot, it's, you know, 785 00:45:08,480 --> 00:45:10,680 Speaker 1: hard to anticipate all the things you're going to 786 00:45:10,719 --> 00:45:12,960 Speaker 1: need to be able to do. If you're talking general 787 00:45:13,040 --> 00:45:16,279 Speaker 1: purpose and adaptable, that's really tough. I mean, we didn't 788 00:45:16,280 --> 00:45:19,239 Speaker 1: even talk about the adaptability problem very much. We talked 789 00:45:19,239 --> 00:45:21,719 Speaker 1: a little bit about a robot capable of learning from 790 00:45:21,719 --> 00:45:25,399 Speaker 1: other people, which I find fascinating. Um, you know, that's 791 00:45:25,440 --> 00:45:29,640 Speaker 1: one way, just watching humans and then mimicking humans, that's 792 00:45:29,680 --> 00:45:34,759 Speaker 1: one way of robot learning. There's also, um, the way 793 00:45:34,760 --> 00:45:38,319 Speaker 1: where it's controlled through virtual reality by a human and 794 00:45:38,360 --> 00:45:41,040 Speaker 1: it just kind of logs the motions the human's making 795 00:45:41,120 --> 00:45:43,320 Speaker 1: it do. Like, NASA has a robot, Robonaut, 796 00:45:43,520 --> 00:45:48,080 Speaker 1: that learns like that. So I think there's 797 00:45:48,120 --> 00:45:52,960 Speaker 1: a lot of benefit to investigating artificial intelligence.
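(That log-and-replay style of teaching, where a human drives the robot and its motions are recorded for later playback, can be pictured in a few lines. This is a hypothetical sketch of the general idea, not NASA's actual Robonaut software.)

import time

def record(read_joints, duration_s: float, hz: float = 50.0):
    """Sample the operator-driven joint angles into a timestamped trajectory."""
    trajectory = []
    start = time.monotonic()
    while (t := time.monotonic() - start) < duration_s:
        trajectory.append((t, read_joints()))  # log (time, joint angles)
        time.sleep(1.0 / hz)
    return trajectory

def replay(trajectory, send_joints):
    """Play the logged motion back on the robot with the recorded timing."""
    start = time.monotonic()
    for t, joints in trajectory:
        while time.monotonic() - start < t:    # wait until the logged instant
            time.sleep(0.001)
        send_joints(joints)

(Here read_joints and send_joints stand in for whatever interface the real robot exposes; the human demonstration itself becomes data the robot can repeat.)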
Like, you 798 00:45:53,000 --> 00:45:56,239 Speaker 1: want your Nest at home to learn so you don't 799 00:45:56,239 --> 00:45:59,440 Speaker 1: have to keep adjusting the thermostat. That counts. That's 800 00:45:59,640 --> 00:46:05,120 Speaker 1: machine learning to me. The big argument against having 801 00:46:05,400 --> 00:46:10,120 Speaker 1: humanoid robots, and where the quagmire seems to begin, is sociability. 802 00:46:10,719 --> 00:46:13,239 Speaker 1: That seems to be the whole reason anybody wants a 803 00:46:13,320 --> 00:46:15,960 Speaker 1: humanoid robot. Because, you know, you have 804 00:46:16,000 --> 00:46:20,160 Speaker 1: a Roomba, it vacuums. You can make a driverless car, um, 805 00:46:20,200 --> 00:46:23,799 Speaker 1: as Olivia Solon wrote in Wired a couple of years back. Um, 806 00:46:25,040 --> 00:46:26,520 Speaker 1: you know, why do you have to make a 807 00:46:26,600 --> 00:46:28,839 Speaker 1: robot butler to park the car? Just make a car 808 00:46:28,920 --> 00:46:32,160 Speaker 1: that parks itself. And it seems like that's where we're 809 00:46:32,200 --> 00:46:35,800 Speaker 1: going right now. Um. So when you add this extra 810 00:46:35,880 --> 00:46:41,080 Speaker 1: layer of humanoid, you add all of these additional problems 811 00:46:41,160 --> 00:46:44,600 Speaker 1: and troubles and redundancies. Like, for example, if you're 812 00:46:44,600 --> 00:46:48,279 Speaker 1: gonna make a humanoid robot that throws a ball, this 813 00:46:48,440 --> 00:46:52,759 Speaker 1: humanoid robot, to appear real, needs to have a 814 00:46:52,760 --> 00:46:56,040 Speaker 1: little bit of follow-through. But as far as the 815 00:46:56,120 --> 00:46:58,279 Speaker 1: robot, the machine, is concerned, it can throw the ball 816 00:46:58,360 --> 00:47:01,600 Speaker 1: and just stop where it releases it. It doesn't need to 817 00:47:01,600 --> 00:47:04,560 Speaker 1: go any further. But it's gonna look weird and robotic. If 818 00:47:04,560 --> 00:47:06,960 Speaker 1: you want to get past that uncanny valley, which is 819 00:47:07,000 --> 00:47:10,080 Speaker 1: another problem, um, the thing has to have follow-through, and 820 00:47:10,400 --> 00:47:13,799 Speaker 1: that's totally unnecessary. You can make a robot that can 821 00:47:13,840 --> 00:47:16,800 Speaker 1: throw a ball, and the goal is to throw the ball. 822 00:47:17,520 --> 00:47:19,719 Speaker 1: You don't have to add the follow-through, but you 823 00:47:19,840 --> 00:47:23,840 Speaker 1: do when you're making a sociable humanoid robot. So 824 00:47:23,880 --> 00:47:27,280 Speaker 1: it seems like that's the path everyone 825 00:47:27,320 --> 00:47:30,160 Speaker 1: is being led down, and that's the part that I don't get. Yeah, I like 826 00:47:30,320 --> 00:47:34,359 Speaker 1: the idea of designing robots for specific tasks, because you 827 00:47:34,400 --> 00:47:37,239 Speaker 1: can really focus on getting the task done. So 828 00:47:37,320 --> 00:47:39,960 Speaker 1: I see this as two separate branches. I see the 829 00:47:40,000 --> 00:47:43,799 Speaker 1: branch of developing the humanoid robot as pushing forward a 830 00:47:43,880 --> 00:47:47,320 Speaker 1: lot of different areas of thought that could be applied 831 00:47:47,360 --> 00:47:51,319 Speaker 1: in multiple disciplines, so we'll benefit from that.
I 832 00:47:51,400 --> 00:47:55,600 Speaker 1: see the development of robots as unitaskers as being important 833 00:47:55,600 --> 00:47:58,799 Speaker 1: to actually handle the jobs that are the three Ds. 834 00:47:58,920 --> 00:48:02,319 Speaker 1: That's dirty, dull, and dangerous. All right. So those are 835 00:48:03,760 --> 00:48:07,120 Speaker 1: the jobs that maybe involve a lot of repetition, 836 00:48:07,560 --> 00:48:10,799 Speaker 1: which can cause injury over time, or it can lead 837 00:48:10,840 --> 00:48:13,759 Speaker 1: to mistakes because you've done the same task so many 838 00:48:13,760 --> 00:48:16,240 Speaker 1: times that you start to kind of zone out. Robots 839 00:48:16,280 --> 00:48:19,800 Speaker 1: will never zone out. Um, if it's a dirty job, 840 00:48:19,880 --> 00:48:23,120 Speaker 1: where it's something that's undesirable to people, robots don't care, 841 00:48:23,360 --> 00:48:27,000 Speaker 1: they'll do that. Or if it's dangerous, if it's bomb disposal, 842 00:48:27,280 --> 00:48:31,399 Speaker 1: or if it's something like the Mars Curiosity rover. These 843 00:48:31,400 --> 00:48:34,759 Speaker 1: are dangerous jobs that you wouldn't necessarily want to put 844 00:48:34,760 --> 00:48:39,120 Speaker 1: a human into if you had the alternative. So all 845 00:48:39,160 --> 00:48:42,120 Speaker 1: of these things, that's where robots really make sense 846 00:48:42,120 --> 00:48:44,600 Speaker 1: to me, um, to go to the places that 847 00:48:44,680 --> 00:48:46,880 Speaker 1: are difficult for us to go to. Maybe that's, you know, 848 00:48:47,160 --> 00:48:51,080 Speaker 1: deep sea exploration, space exploration, that kind of thing, or 849 00:48:51,280 --> 00:48:53,840 Speaker 1: to do jobs that might be dangerous, having a first 850 00:48:53,880 --> 00:48:56,799 Speaker 1: responder robot survey a scene to make sure 851 00:48:56,880 --> 00:49:01,000 Speaker 1: that a structure is remaining solid where maybe there was 852 00:49:01,040 --> 00:49:02,920 Speaker 1: a fire and it has to make sure that the 853 00:49:03,880 --> 00:49:08,080 Speaker 1: building is not going to collapse in on a rescue mission, that 854 00:49:08,160 --> 00:49:10,800 Speaker 1: kind of stuff. Um, but do you need those things 855 00:49:10,840 --> 00:49:13,000 Speaker 1: to come out and be able to tell 856 00:49:13,040 --> 00:49:16,640 Speaker 1: a joke or something? And most of them don't need 857 00:49:16,719 --> 00:49:20,239 Speaker 1: to be any sort of humanoid form factor either, which 858 00:49:20,880 --> 00:49:24,480 Speaker 1: greatly simplifies the actual development of the robot and thus 859 00:49:24,520 --> 00:49:26,920 Speaker 1: cuts down on the cost, so you can 860 00:49:26,920 --> 00:49:30,439 Speaker 1: achieve the task you're trying to achieve for less money 861 00:49:30,520 --> 00:49:32,440 Speaker 1: than if you are trying to build this general 862 00:49:32,520 --> 00:49:36,480 Speaker 1: purpose machine. So then we come to this ultimate question: 863 00:49:37,440 --> 00:49:43,000 Speaker 1: why? What's the purpose of a humanoid robot? Well, I think 864 00:49:43,000 --> 00:49:46,080 Speaker 1: the purpose is twofold. One is, again, 865 00:49:46,200 --> 00:49:51,200 Speaker 1: to have that specific goal in mind that allows you 866 00:49:51,239 --> 00:49:56,359 Speaker 1: to define where your endpoint is.
I believe that when 867 00:49:56,360 --> 00:49:59,600 Speaker 1: you have that defined end goal, it makes it easier 868 00:49:59,640 --> 00:50:01,520 Speaker 1: for you to build on the things you need to 869 00:50:01,560 --> 00:50:04,200 Speaker 1: achieve it, as opposed to having an open goal where 870 00:50:04,200 --> 00:50:07,680 Speaker 1: it's just, I want to improve AI. That's 871 00:50:07,719 --> 00:50:09,920 Speaker 1: so open that it's hard to get direction from it. 872 00:50:09,960 --> 00:50:12,600 Speaker 1: But if you think, I need to have an artificial 873 00:50:12,640 --> 00:50:16,400 Speaker 1: intelligence that will allow a robot to, uh, here's a 874 00:50:16,400 --> 00:50:18,680 Speaker 1: great example. Let's say that the challenge is to have 875 00:50:18,719 --> 00:50:23,000 Speaker 1: a robot leave a room, go down a flight of stairs, 876 00:50:23,560 --> 00:50:27,040 Speaker 1: leave a building, get into a vehicle, drive the vehicle 877 00:50:27,080 --> 00:50:30,440 Speaker 1: to a different location, get out of the vehicle, go 878 00:50:30,480 --> 00:50:33,919 Speaker 1: into another building, break through a wall, and put out 879 00:50:33,920 --> 00:50:39,280 Speaker 1: a fire. That's a real, actual robotics challenge. Yeah. 880 00:50:39,360 --> 00:50:42,120 Speaker 1: Dr. Henrik Christensen told me about this. I mean, 881 00:50:42,160 --> 00:50:44,600 Speaker 1: it really is a challenge. It's not just a real challenge, 882 00:50:44,640 --> 00:50:48,839 Speaker 1: it's a real Challenge, like a DARPA challenge. So, 883 00:50:49,000 --> 00:50:51,600 Speaker 1: uh, he was telling me about this, and 884 00:50:51,680 --> 00:50:53,560 Speaker 1: you start to think about all the things that have 885 00:50:53,640 --> 00:50:54,920 Speaker 1: to fall in line for you to be able to 886 00:50:54,920 --> 00:50:57,200 Speaker 1: achieve this goal. That is a valuable thing. But I 887 00:50:57,239 --> 00:50:59,319 Speaker 1: think the other thing is the social aspect. I think 888 00:50:59,360 --> 00:51:02,720 Speaker 1: that there are people who would benefit from a robot 889 00:51:02,840 --> 00:51:06,560 Speaker 1: that is able to give some form of social comfort. 890 00:51:06,640 --> 00:51:11,080 Speaker 1: Let's say, for the elderly who need to have some 891 00:51:11,120 --> 00:51:14,960 Speaker 1: form of interaction. Um, you know, that could actually be 892 00:51:15,080 --> 00:51:17,520 Speaker 1: a really valuable tool. And, in fact, there's a lot 893 00:51:17,520 --> 00:51:20,919 Speaker 1: of work that's going into robotics to help people like 894 00:51:21,040 --> 00:51:27,760 Speaker 1: the elderly, who may have real emotional and psychological problems, um, 895 00:51:27,840 --> 00:51:31,000 Speaker 1: due to loneliness. Do you think that robots are the 896 00:51:31,000 --> 00:51:34,520 Speaker 1: answer to that? I think that robots can help.
I 897 00:51:34,560 --> 00:51:36,480 Speaker 1: don't know, I would never go so far as to 898 00:51:36,520 --> 00:51:39,759 Speaker 1: say the answer. But couldn't you also make the argument that 899 00:51:40,280 --> 00:51:46,720 Speaker 1: if you created robots that displaced human jobs, and also 900 00:51:46,800 --> 00:51:53,640 Speaker 1: simultaneously said, hey, this nursing home sector is about 901 00:51:53,680 --> 00:51:56,120 Speaker 1: to explode because we've got a bunch of baby boomers, 902 00:51:56,120 --> 00:51:59,920 Speaker 1: and let's say society has now decided that our elderly 903 00:52:00,040 --> 00:52:05,600 Speaker 1: need human interaction more than we've provided it 904 00:52:05,600 --> 00:52:08,920 Speaker 1: before, so let's create this whole other industry, 905 00:52:09,000 --> 00:52:13,400 Speaker 1: or let's expand this industry of elderly caretakers, and fill 906 00:52:13,760 --> 00:52:17,800 Speaker 1: those jobs with people who have been displaced by worker robots. 907 00:52:18,239 --> 00:52:22,720 Speaker 1: Wouldn't that be better? That might be. Or, again, 908 00:52:22,960 --> 00:52:25,640 Speaker 1: looking at the way a lot of roboticists frame this, 909 00:52:26,320 --> 00:52:28,920 Speaker 1: they say, all right, well, it is a reality that 910 00:52:29,080 --> 00:52:34,600 Speaker 1: robots are taking over actual jobs. But the hope is 911 00:52:34,640 --> 00:52:38,239 Speaker 1: that it also ends up creating new jobs that are 912 00:52:38,280 --> 00:52:43,640 Speaker 1: better-paying jobs, less dangerous jobs, friendlier jobs. 913 00:52:44,480 --> 00:52:47,680 Speaker 1: But yeah, like, um, the idea being that it 914 00:52:47,800 --> 00:52:52,080 Speaker 1: frees up people and encourages the pursuit of jobs in 915 00:52:52,200 --> 00:52:57,560 Speaker 1: engineering and computer science. Now, we live in the real world, 916 00:52:57,640 --> 00:53:00,759 Speaker 1: and we understand that it's a lot more complex than 917 00:53:00,840 --> 00:53:03,720 Speaker 1: telling someone who's been working on a manufacturing line, hey, 918 00:53:03,960 --> 00:53:06,239 Speaker 1: I'm sorry, your job's gone because there's a robot here, 919 00:53:06,239 --> 00:53:08,799 Speaker 1: but guess what, we have an opening in engineering. So 920 00:53:08,840 --> 00:53:11,560 Speaker 1: if you just go and pursue a four-year degree 921 00:53:11,600 --> 00:53:14,200 Speaker 1: and then some postgraduate work, you'll be right back 922 00:53:14,239 --> 00:53:17,799 Speaker 1: to work. That's obviously not, uh, something that's going 923 00:53:17,840 --> 00:53:21,120 Speaker 1: to be easy, especially in the short term. But the 924 00:53:21,200 --> 00:53:24,000 Speaker 1: long-term hope is that more and more of these 925 00:53:24,080 --> 00:53:27,719 Speaker 1: jobs that are dangerous for people, or less desirable 926 00:53:27,760 --> 00:53:31,480 Speaker 1: for people, will be taken up by robots, and then 927 00:53:31,680 --> 00:53:35,280 Speaker 1: there will be the creation of better jobs 928 00:53:35,320 --> 00:53:37,000 Speaker 1: that are higher up on the food chain. And I 929 00:53:37,040 --> 00:53:39,759 Speaker 1: think that makes sense to me. It's just, once 930 00:53:39,840 --> 00:53:44,120 Speaker 1: you enter the sociability... Yeah, because without sociability, there's 931 00:53:44,160 --> 00:53:47,279 Speaker 1: no reason to create a humanoid robot.
Everything else can 932 00:53:47,320 --> 00:53:51,239 Speaker 1: look like a robot. Yeah. Um, so it's when 933 00:53:51,239 --> 00:53:53,920 Speaker 1: you enter sociability that you lose me. Not only 934 00:53:53,920 --> 00:53:55,839 Speaker 1: can it look like a robot, but we can 935 00:53:55,880 --> 00:53:58,239 Speaker 1: still socialize with it even if it doesn't look like 936 00:53:58,280 --> 00:54:01,440 Speaker 1: a human. You know, there's the story that Roomba 937 00:54:01,520 --> 00:54:05,000 Speaker 1: owners name their Roombas. Yeah, so we end 938 00:54:05,080 --> 00:54:09,360 Speaker 1: up having these kinds of emotional attachments and investments in 939 00:54:10,000 --> 00:54:12,560 Speaker 1: things that don't look... Not only do they not look human, 940 00:54:12,640 --> 00:54:15,680 Speaker 1: they don't look like any other animal that we would 941 00:54:15,680 --> 00:54:20,160 Speaker 1: interact with on a, like, owner-and-pet level or whatever. 942 00:54:20,200 --> 00:54:23,440 Speaker 1: I mean, they're just a robot. So I 943 00:54:23,440 --> 00:54:25,160 Speaker 1: think, at the end of the day, Josh, I think 944 00:54:25,200 --> 00:54:27,279 Speaker 1: we're on the same page. We think humanoid robots are 945 00:54:27,280 --> 00:54:30,640 Speaker 1: an interesting idea, but not necessarily the end goal. There's 946 00:54:30,719 --> 00:54:34,120 Speaker 1: not a whole lot of incentive to go 947 00:54:34,239 --> 00:54:36,880 Speaker 1: after it for its own purposes. We can see the 948 00:54:36,920 --> 00:54:39,319 Speaker 1: benefits of going after it in the sense that the 949 00:54:39,360 --> 00:54:42,239 Speaker 1: developments that are made in that pursuit help us in 950 00:54:42,280 --> 00:54:45,600 Speaker 1: other ways. And that wraps up this classic episode of 951 00:54:45,640 --> 00:54:47,680 Speaker 1: tech Stuff. I hope you enjoyed it. If you have 952 00:54:47,800 --> 00:54:50,680 Speaker 1: suggestions for topics I should cover in future episodes, please 953 00:54:50,719 --> 00:54:52,880 Speaker 1: reach out to me on Twitter. The handle for the 954 00:54:52,880 --> 00:54:55,960 Speaker 1: show is TechStuffHSW, and I'll talk to 955 00:54:56,000 --> 00:55:04,640 Speaker 1: you again really soon. Tech Stuff is an I Heart 956 00:55:04,760 --> 00:55:08,479 Speaker 1: Radio production. For more podcasts from I Heart Radio, visit 957 00:55:08,520 --> 00:55:11,600 Speaker 1: the I Heart Radio app, Apple Podcasts, or wherever you 958 00:55:11,680 --> 00:55:13,000 Speaker 1: listen to your favorite shows.