Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. Hey there, and welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeart Podcasts, and how the tech are you? So last week, Tesla held an event called We, Robot, in which attendees got to see a new vehicle that was dubbed the Robovan, although Elon Musk pronounced it more like "Robovahn." There was a Cybercab that Musk claims is going to cost less than thirty thousand dollars when it goes into production sometime before twenty twenty seven, which would potentially allow the average person to run a small Uber or Lyft business out of their own garage, with the Cybercab giving people rides. Though that raises a lot of questions I have, like liability issues. Let's say that your Cybercab got into an accident. Are you held liable for that? Or is Tesla? Or, I don't know. That's a discussion for another time. But they also gave a closer look at the humanoid Optimus robots, and they had robots that were dancing and serving drinks and some that even held conversations with attendees. Now, those robots had some help. Tesla also did not hide this fact. This is not like a gotcha, because the company was very forthright about this. Remote operators were augmenting the robots' abilities while those robots were on the floor. So, for example, those conversations were actually human beings who were using the robots kind of like an advanced bipedal intercom system. But it made me think about the long history of humans trying to make humanoid robots. Now, in some ways this pursuit is a bit strange, because a legged, bipedal humanoid robot brings with it a ton of challenges that you could sidestep if you just made some compromises to your design approach. Like, why does the robot need to be bipedal and humanoid?
Speaker 1: If you decided the robot should move around on wheels or tank treads, or maybe move around on all fours, or maybe you make like a centaur-like robot where it has like a base with four legs and then a torso with two arms and it stands upright. You know, we make the rules, like it doesn't have to take any specific form factor, and you could do that and get around a lot of the challenges you would face if you were to instead focus on this humanoid bipedal approach. Creating a robot that can move the way humans do is hard. It has taken decades of research and development to accomplish that to a reliable degree, and even then it's typically under very controlled circumstances. When you start getting into stuff like uneven terrain, it gets a lot trickier. Okay. In other ways, the desire to build a human-like robot is totally understandable. You know, my first reaction to anyone talking about developing a humanoid robot is why? Why are you doing that? What's your reasoning? Because if you can accomplish the same goal using a different method of locomotion, that might be the better choice. However, if you want this robot to be able to do tasks in our human world, like tasks that human beings would typically carry out on their own, well, making the robot human-shaped makes more sense. You don't have to adapt your environment to the abilities of the robot, right? Because a set of stairs or a ladder would defeat most wheeled robots. They would come to it and say, well, I can't navigate up this obstacle, and if my goal is on the other end of that obstacle, that's a problem. So to really maneuver within the human world, it helps to have your typical human shape and typical human capabilities. Now, the flip side of this is that we humans, we have an obligation to make spaces accessible for those who have an atypical shape or atypical capabilities. We need to make sure that people who don't have the use of, say, their legs, can still access really important stuff.
Speaker 1: That's a responsibility we have, but that's a topic for another episode. Plus, I think there is something inherent within us, or at least inherent within some of us, some human beings, and it drives us to want to create companions that look and, to an extent, behave the way we do. So much of science fiction is based around variations of this idea. Arguably Mary Shelley's Frankenstein is kind of along this track of thinking. You know, you have your crazy scientist who desires to play god, and then you have Karel Čapek's influential work, Rossum's Universal Robots. That's the work where we get the term robot in the first place. Or the replicants in Blade Runner, which actually pretty closely resemble the robots in Čapek's work, to the works of Isaac Asimov and beyond. So in many of these pieces there's a warning that is given that the pursuit to build and exploit robots often comes tinged with arrogance and hubris, and it rarely works out well for anybody by the end of the story. Now, not all of those robots were mechanical or electromechanical or digital creatures. In fact, like Frankenstein's monster, the robots in Čapek's work and the replicants in Blade Runner are all synthetic life forms. They're not masses of wires and circuit boards. Asimov's work introduced more mechanical and electromechanical creatures with artificial brains. But ultimately, all the stories involve creatures and creations that gain an awareness of themselves and their place in the world, and how they reject the hand that has been dealt to them, often to dramatic and catastrophic degrees. Now, what the future holds as far as humanoid robots go has yet to be written, though we can certainly look back and talk about some of the history in the field of humanoid robots that has happened so far. Before robots, we have examples of automata that mimicked human movement and capabilities. Some of these were actual clockwork creations, such as the karakuri.
Speaker 1: These are puppets from Japan, dating as far back as the seventeenth century, so the sixteen hundreds. These used mechanisms to power certain movements, usually repeatable movements like serving tea or playing a musical instrument. Some of the early automata weren't automata at all. They were hoaxes. The famed Mechanical Turk creation of Wolfgang von Kempelen is such an example. Kempelen actually stowed a human operator inside a cabinet that was attached to this supposed automaton that had been designed to play chess. In reality, it was the human operator hidden inside the cabinet that was controlling everything. It was really just a puppet, not an automaton. But these creations, whether actual automata or not, were limited in function, and typically they weren't bipedal either, not truly, like they weren't moving around on two legs. They might be stationary, in the case of the Mechanical Turk, or they might have wheels, and they have like robes that cover up the wheels, so it looks like they're kind of gliding across, but they're not actually walking. And I also have to include one story I came across because it's just too absurd to leave out. So in eighteen forty eight, in May of eighteen forty eight, a couple of different journals, one of them being Scientific American, published an account of a supposed encounter with a remarkable automaton that was capable of standing up, sitting down, and even of speaking. The automaton itself was apparently almost indistinguishable from a human being. The account was said to have originally been published in, quote, an Augsburg Gazette, end quote. So Augsburg is a city in Bavaria in Germany, and so the original article was supposedly written in German and then translated into English to be published in various periodicals in the United States and elsewhere. The automaton's name was mister Eisenbrass, which is great, but the name of the inventor was doctor Lube, which, y'all, that's like a gift from the gods of comedy right there. Mister Eisenbrass and Doctor Lube.
Speaker 1: I maintain that someone should make a stage production that has the title Mister Eisenbrass and Doctor Lube, and it might well be me, unless someone beats me to it. And someone probably will beat me to it, because I'm infamous for coming up with ideas for shows or novels or whatever and then just sitting on them. But my goodness, Mister Eisenbrass and Doctor Lube. That's... that's a title, y'all. Anyway, according to this article, the author and some other visitors went to the lab of doctor Lube, which I imagine was down a slippery slope. Anyway, the doctor was, quote, seated at a sort of cabinet having a keyboard somewhat similar to that of a pianoforte arranged on one side of it, and nearly in the center of a room sat a fashionably dressed gentleman who rose and bowed as we entered, end quote. And then, according to the article, the visitors engaged in some small talk with this fashionably dressed gentleman, and the gentleman actually took a seat after the visitors had sat down, and eventually the doctor stops playing at his keyboard, and mister Eisenbrass goes quiet, and Lube explains that the whole thing is a mechanical contraption. Only then do the visitors notice the cables going from the keyboard console to the chair of mister Eisenbrass. Now, according to the piece, Lube procured bones from a human being, presumably a dead one, which is a big ol' ick already, but then used rubber tubes to kind of serve as musculature, and that he also created, quote, a perfect system of nerves made of fine platinum wire and covered with silk, end quote. To what end, you might say? What are the nerves for? I guess the idea is that electric motors would pull upon the rubber tubes. It does explain that there were electromagnets that were in use in this system, and that the tubes just served as muscles that, when you pulled on them, would cause the contraction you would associate with a human body.
Speaker 1: I actually at first assumed, since they were talking about rubber tubes, that this was going to be a pneumatic system where you would use air to achieve similar results. Right? You pump air into something to extend a limb, and you would allow air to escape to contract the limb. But that apparently is not how this was supposed to work. Supposedly, the keyboard allowed the doctor to produce incredible results just by pressing a few keys, like I guess there was a key that was just labeled small talk or something, and that the figure was apparently capable of, quote, walking, talking, singing, playing the piano, and doing many other things with as much ease and precision as an accomplished man, end quote. The author then proactively chides the reader for undoubtedly asking, so what good is all this? And then the author goes on to talk about how mechanical servants will replace all those undependable louts and scallywags who currently act as servants, and thus give the women of the household the freedom to carry out their feminine duties as caretaker of the home just by tickling some ivories. Yeah, this article is well and truly both sexist and classist. Anyway, to say that I am skeptical about this account is putting it lightly. I feel fairly certain that no such demonstration ever actually took place, or if there were some kind of demonstration, it did not unfold as described in this article. I did try to find the original article written in German. I assumed that the German gazette that was referenced in the English piece must have been the Allgemeine Zeitung that was published in Augsburg, Germany for most of the nineteenth century, and was like the main paper not just of Augsburg, but like of that region of Germany. However, I found no record of Eisenbrass or doctor Lube anyway.
Speaker 1: As diverting as mister Eisenbrass and doctor Lube are, I feel confident in saying that the capability of building a robot that could maintain its balance standing still, let alone walking around, all without other means of support, was likely well outside the reach of even the most clever of innovators in the mid eighteen hundreds. That seems like that's just a no-brainer. When it comes to creating a two-legged robot, one where most of the mass is actually above the legs of the robot, not contained within the legs, things get really tricky, because a lot of physics have to be considered before engineers could get serious about bipedal robots. If we relied solely on trial and error and just figured we're going to get this right, we'd likely not be anywhere as far along as we are right now. So when we come back, we're going to consider how challenging it is to create something that walks around on two legs. But before we get to that, and I get a little unbalanced, let's take a quick break to thank our sponsors. All right. So what's the big deal with walking around on two legs? Lots of people do it all the time. You know, toddlers can get the hang of it without too much trouble, and we celebrate it when it happens, like that's a big deal. But then shortly after that, it really becomes just a source of stress as the toddlers toddle along toward one danger or another. But you're entirely dependent upon just two points of contact with your surrounding environment, if you're talking about a true bipedal form that is capable of moving around the area, and those two points of contact with the environment are essentially the bottoms of the feet. That's, of course, if everything is actually going as you wanted it to. If things have gone poorly, you might actually have lots of different points of contact with the ground. But that's because you went horizontal after something went wrong. So you've got your robot.
Speaker 1: It's got two legs, and the two points of contact with the ground are the bottoms of the feet, or really the legs and the feet, and your robot in general has a number of degrees of freedom. So degrees of freedom are joints that allow movement along at least one axis. The point of contact with the ground could actually be considered a passive degree of freedom in itself. And you're also relying on friction to allow your robot to stand up, to maintain balance, and to get anywhere. If the robot's feet were frictionless, then it wouldn't be able to stay upright at all, let alone walk. And you've got all this weight above the legs that you have to worry about. So have you ever balanced something like, say, a baseball bat on the palm of your hand? If you do that with a baseball bat and you're using the narrow part, the handle end in other words, of the bat, and you're balancing that on your palm and the thick part of the bat is up in the air, you know that little motions can create big results. Right? A small movement at the base can cause the top to really sway, and that requires you to make larger corrections down at the base in order to keep everything in balance. Well, that's kind of what it's like to try and figure out how to make a bipedal robot walk while maintaining its balance. You've got inertia to deal with, and that really affects balance. How do you make sure the mass above the legs doesn't throw everything off-kilter whenever it starts moving or when it stops moving? How do you correct for that so that your robot doesn't just tumble over? How do you keep your bot upright? One early discussion that became a core component in the pursuit of bipedal robots revolved around a concept that came to be known as the zero moment point, or ZMP. So a pair of smarty-pantses from Yugoslavia named Miomir Vukobratović and Davor Juričić first described this back in the nineteen sixties.
Speaker 1: Now, the actual term zero moment point would be coined a little bit later, but it was used to describe what they were talking about, and the whole concept revolves around a point at which the net moment of the reaction forces between a bipedal mechanism's feet and the ground is essentially zero along the horizontal axes, so there's no tipping motion in the horizontal plane. So that means, like, forward momentum and gravity have kind of canceled each other out. You've hit this zero moment where you don't have to worry about the robot tipping forward and falling over. You could think of it as a moment of stability, and the robot, assuming no external forces are acting upon it, will remain upright, assuming that ZMP is maintained. Now, this discussion really drives home how stability can be a huge challenge. As robots move, they must deal with inertia, and you have to know the math to achieve dynamic stability if your robot is to remain upright, whether it's walking, running, jumping, or whatever. And in fact, it gets increasingly more difficult as you go down those tasks. Walking is hard, running is really hard, and jumping is really, really hard. Like, the jump part might be easy; the landing and staying upright, that's the hard part. Otherwise your robot's gonna topple over. And while watching a robot take a tumble might be a great YouTube video, in practice, in the real world, you obviously don't want your robots to be falling over. You want your robots to be stable and capable of moving around environments without causing damage or potentially injury to people, to be able to maneuver. It's tough, and robots are expensive. You don't want them falling over. Then you're thinking, well, that's twenty thousand dollars to get this thing back on its feet again. That's not a great way to make progress either. So to plan the motion for a robot, you need to be able to calculate the zero moment point, or ZMP.
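To get a feel for the arithmetic involved, here is a minimal sketch using the simplest textbook simplification, a point mass held at a constant height over the foot (the so-called linear inverted pendulum, the same basic picture as the baseball bat balanced on a palm). The numbers and function names are purely illustrative and are not taken from any of the robots or papers mentioned in this episode.

```python
# Toy ZMP check under the linear inverted pendulum simplification (illustrative only).
GRAVITY = 9.81  # m/s^2

def zmp_x(com_x, com_height, com_accel_x):
    """Forward/backward ZMP location for a point mass kept at a constant height.

    Accelerating the mass forward pushes the ZMP backward, and vice versa.
    """
    return com_x - (com_height / GRAVITY) * com_accel_x

def is_supported(zmp, heel_x, toe_x):
    """In this simplified model, the robot won't start tipping while the ZMP
    stays between the rearmost and foremost points of contact with the ground."""
    return heel_x <= zmp <= toe_x

# Center of mass 0.8 m up, directly over a 24 cm foot, braking hard at 3 m/s^2.
zmp = zmp_x(com_x=0.0, com_height=0.8, com_accel_x=-3.0)
print(f"ZMP at {zmp:+.3f} m, supported: {is_supported(zmp, heel_x=-0.12, toe_x=0.12)}")
# Prints a ZMP of roughly +0.24 m: outside the foot, so this motion would tip the robot.
```

The planning problem is essentially running that check in reverse: choose center-of-mass motions whose computed ZMP never leaves whatever footprint is on the ground at that instant.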
Speaker 1: You need to figure out which joints the robot is going to have to engage in order to achieve stability under its various operating conditions, and those conditions could include things outside of your strict control. It's one thing to calculate how the robot can achieve stability when it's walking across a level floor that has a firm surface. But what about a floor whose surface isn't as firm? Maybe it's a little squishy, you know, maybe it's a nineteen seventies shag carpet or something. Or what if the floor isn't level? What if the terrain is actually uneven, you know, kind of like a typical sidewalk in the city of Atlanta? How does a robot compensate for all this, remain stable, and keep itself from pitching over? This is a non-trivial challenge, and it takes a lot of work to get to a point where robots are sophisticated enough to achieve stability. Engineers have had to take a lot into consideration. Would more degrees of freedom help, or does that actually overcomplicate matters? I mean, there's no reason why we should be constrained to the same degrees of freedom that a person has. Right? Like, we could think, oh, let's mimic the way humans work and make sure that the ankles and the knees and the hips all have the same points of articulation. Or we could say, well, there's no reason why we couldn't have more or fewer joints if it makes the operations work better. So that's something to think about, you know. And then walking would require shifting stability so that the robot can maintain itself with just one foot in contact with the ground, right? Like, suddenly points of contact have halved. If you're running, it's even harder, because with running, at least the definition of running that roboticists use, there's a moment that may only last just for a split second, but there's a moment in which both of the robot's feet are not touching the ground. So how do you achieve stability when your point of contact is continually interrupted?
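As a concrete picture of what those degrees of freedom buy you, here is a tiny, purely hypothetical sketch of a flat, two-dimensional leg with three joints, a hip, a knee, and an ankle. Forward kinematics just chains the joint angles together to find out where the foot ends up, and every extra joint is one more angle the controller can adjust when it has to place that foot on a stair tread or a cracked sidewalk without letting the body tip.

```python
import math

# Made-up link lengths for a planar leg, in meters: thigh, shank, foot segment.
LINKS = [0.45, 0.45, 0.10]

def forward_kinematics(joint_angles, links=LINKS):
    """Chain the joints: each one rotates relative to the link above it, and each
    link extends from wherever the previous one ended. Angles are in radians,
    with zero meaning 'pointing straight down from the hip.'"""
    x, y, angle = 0.0, 0.0, 0.0
    for joint, link in zip(joint_angles, links):
        angle += joint                   # one degree of freedom per joint
        x += link * math.sin(angle)      # horizontal reach of this link
        y -= link * math.cos(angle)      # vertical drop below the hip
    return x, y

print(forward_kinematics([0.0, 0.0, 0.0]))    # straight leg: foot 1.0 m directly below the hip
print(forward_kinematics([0.3, -0.6, 0.3]))   # hip forward, knee bent, ankle compensating:
                                              # foot still under the hip, about 4 cm higher
```

Walking flips that around: instead of asking where a given set of angles puts the foot, the controller has to pick angles that land the foot where the next step needs it while the rest of the body keeps moving, and it has to do that with only one foot down. That is part of why single support, and then running, raises the difficulty so sharply.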
Speaker 1: Jumping is even harder, right, because you're really leaving the ground then, and how do you ensure that when you land, you do so in a way that you maintain stability? So that ZMP was a huge deal. It still is a huge deal. Not all robot locomotion is centered around ZMP calculations, by the way, but a lot of it still is. So a lot of the work in bipedal robots, particularly in the seventies and eighties, involved calculating ZMP and keeping the robot within that zone while engineering movements. So calculating ZMP is one thing. Building a robot that can balance is another. Even building a robot that's capable of static stability is no simple task. Static stability just means the robot is able to stand still and not fall over, and believe it or not, that's easier said than done. In nineteen seventy four, labs at the School of Science and Engineering at Waseda University in Japan started a project with the goal of building a stable bipedal robot. This project became known as WABOT, which stands for Waseda Robot. The labs created a new focus group within the laboratory system at the university, called the Bioengineering Group, and their first effort, the WABOT-1, wasn't exactly something that you would mistake as a human being. It was not like a smooth humanoid robot. You would never look at it and think it was anything other than a robot. In fact, it looked kind of like a giant Erector Set. It was very much a big, blocky, metallic humanoid robot. So it had a limb control system, obviously very important if you're going to have a walking robot. It also had a primitive vision system. It even was able to converse in Japanese to a certain degree. They said that it had the intellectual capacity of, like, a one-and-a-half-year-old. Keep in mind this is the early nineteen seventies. It was tethered to computer systems and power systems.
Speaker 1: We had not yet reached a point of miniaturization where you could have all that computing power, not to mention electrical power, on board the robot itself. If you had done that, the robot would either need to be huge or would be carrying the biggest backpack you've ever seen in order to have all the computational and electrical power necessary to operate this thing. So, yeah, it was tethered. The WABOT was the first humanoid robot to achieve static stability, and later it would be the first digitally controlled anthropomorphic robot that was able to achieve dynamic stability. And statically stable means that the robot can remain balanced while standing still. Dynamically balanced refers to robots that are maintaining that stability while they are still in motion. Going from one to the other and back again is actually really hard. Like, achieving one is hard, achieving the other one is also hard. Going back and forth between still and moving and maintaining balance is even harder. You know, remaining dynamically stable is often easier than going from dynamic to static, assuming that, you know, motions are smooth and fast enough to counter the forces acting on the robot, because when you think about it, walking is really a series of falls. It's like you're falling and you're catching yourself over and over again. You move forward when you walk not just by moving your legs. You know, you kind of lean forward into the walk as well, and for a moment it's as if you're falling forward and that you would faceplant into the ground ahead of you. But of course you've moved your leg to catch yourself, and this process repeats, so you're propelling yourself forward by constantly almost falling but catching yourself over and over and over again. Now, we do this without really thinking about it. Like, once we learn how to walk, we don't have to focus on this. This is just how we do it.
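The static half of that is the part that's easy to write down. In the simplest terms, a pose is statically stable if the center of mass, projected straight down, lands inside the support polygon, the region outlined by whatever parts of the feet are touching the ground. Here is a small, purely illustrative sketch of that check with made-up footprint numbers; real controllers do far more, but this is the core test.

```python
def inside_convex_polygon(point, vertices):
    """True if the point lies inside a convex polygon whose vertices are listed
    counterclockwise: the point has to sit on the left side of every edge."""
    px, py = point
    for i in range(len(vertices)):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % len(vertices)]
        cross = (x2 - x1) * (py - y1) - (y2 - y1) * (px - x1)
        if cross < 0:          # point is to the right of this edge, so it's outside
            return False
    return True

# Ground-contact regions in meters (convex outlines, counterclockwise).
both_feet_planted = [(-0.12, -0.25), (0.12, -0.25), (0.12, 0.25), (-0.12, 0.25)]
one_foot_forward  = [(-0.12, 0.10), (0.12, 0.10), (0.12, 0.25), (-0.12, 0.25)]

com_on_ground = (0.0, 0.0)  # center of mass projected straight down, mid-stance

print(inside_convex_polygon(com_on_ground, both_feet_planted))  # True: statically stable
print(inside_convex_polygon(com_on_ground, one_foot_forward))   # False: shift weight or fall
```

Dynamic balance swaps the center-of-mass projection for the ZMP from earlier and asks the same question many times per second, which is part of why going between standing, walking, and stopping is such delicate work.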
Speaker 1: Robots, however, have to calculate this stuff in order to do it properly, in order to catch themselves with just the right amount of force to keep things moving, and not to use too much or too little force and then risk taking a fall. Tons of work in robotics continued around the world toward this goal of creating bipedal humanoid robots, but nearly all the articles I read for this episode make a big jump from the nineteen seventies to the mid nineteen nineties. That's when Honda unveiled a robot that they called the humanoid P2. So Honda had actually been developing humanoid robots for about a decade before showing off the P2. There was a series that was kind of the in-secret R&D works, called the E series of robots, and this was in the nineteen eighties. These E robots were primitive, but they demonstrated the capability to walk on level ground under very controlled circumstances. The first E robot, for example, was able to walk at an extremely deliberate pace, because it was said that each step took about five seconds to complete. I challenge you to try walking that way, even if it's just a couple of feet; count to five slowly per step. That is slow, but it illustrates how challenging it was to design a robot capable of walking in a bipedal way. Now, the first E series robot was built in nineteen eighty six. It essentially looked like a pair of legs attached to robotic hips, and that's it, like it didn't have any top half. There was no torso or arms or anything like that. The P2 wouldn't debut until nineteen ninety six. Honda had a decade of work developing humanoid robotics before they revealed to the public what they had been up to. And like I said, E0, not very humanoid. It is like a free-walking pelvis, a robotic pelvis. The later robots in the E series started to look, I don't know, in my opinion, even more strange. Like, one of them looks kind of like a microwave oven that has legs.
Speaker 1: One of them looks sort of like the front end of a fancy car, like a Rolls-Royce, but with legs. You would never call any of them human in shape, but they were gradually evolving toward that. The P2 in nineteen ninety six looked much more human, in that it had legs, and it had arms and a torso and a head. Now, this head was rectangular, and it was wider than it was tall. It was not human-looking at all, but it demonstrated that Honda had been hard at work tackling this bipedal challenge, and it would serve as a foundation for a much more famous Honda robot just a few years later, a robot that would debut in two thousand. It was called Asimo. Now, I have a fun connection with Asimo, and I will talk about that, but first let's take another quick break. All right. So the year was two thousand and seven. Asimo had been a thing for the better part of the two thousands, like it debuted in early two thousand and now it's two thousand and seven. I had just been hired by a company called HowStuffWorks dot com, and one of my first assignments was to rewrite and to update the article on how Asimo works. So I researched the project, including the lofty goal that the engineers had set to have a robot that couldn't just walk but could also run. So it would be a robot that, at little points, would have both feet leave the ground just for a moment, but still be able to maintain its balance. Now, that was a huge accomplishment. Even if you were to watch videos and it kind of looks like Asimo's doing a little hopping dance you might do if you were in need of getting to a restroom. Like, I think of it as, oh, it's doing the peepee dance. But you know, watch Asimo running.
Speaker 1: It's cute, it's weird, it has this odd sort of tone to it, I would say. But it was phenomenal, because yes, for a split second, both feet are off the ground, and yet when the opposite foot makes contact with the floor, the robot would maintain its balance and be able to continue running. Asimo looks a lot more human than P2 did. In fact, it looks like sort of a diminutive astronaut in a spacesuit. I actually got to meet Asimo once when I was at Disneyland in California, because there was a demonstration of Asimo there that was a real blast. I watched this presentation that they gave, and then I mentioned to a cast member that I had written an article about how Asimo worked, and they brought me aside and I got to meet the robot, which was a really fun moment for me. That was kind of cool. Anyway, Asimo would go on to establish a lot of firsts in the bipedal humanoid robot space. Not only was it the first one to run, it could also climb and descend stairs, at least eventually it could. One early demonstration didn't go so well, and Asimo tripped, but it later demonstrated those capabilities, and these were things that would be built upon in future robotic projects, both at Honda and elsewhere. I should also mention that Asimo was largely a programmed robot, in that it would maneuver around and interact with an environment that had been carefully mapped out for the robot. So it's not like it was spontaneous. It wasn't coming into a brand new environment, finding its way around, picking things up, that kind of stuff. It was following a very specific set of instructions, and it knew where everything was supposed to be and where it was supposed to go. So it was not like an autonomous robot. But that really wasn't what the project was about. It wasn't about autonomy. It was about robotic locomotion and interacting in human spaces. So you have to keep in mind that this whole approach to robotics is multidisciplinary in nature.
Speaker 1: It requires lots of different work in varying fields, some of which aren't even in technology. I'll talk more about that later. In twenty fifteen, the Defense Advanced Research Projects Agency here in the United States, aka DARPA, held a robotics challenge. So DARPA is known for contracting with various companies and research facilities to develop bleeding-edge technologies potentially useful for the purposes of national defense. They're not always couched in those specific terms, but that's the directive of DARPA. DARPA has played a part in everything from the development of the Internet to the early days of driverless car technology. Well, in twenty fifteen, they had a lofty goal set for robotics teams, and it was all inspired by a terrible disaster. So in twenty eleven, an earthquake and tsunami damaged the Fukushima Daiichi Nuclear Power Plant in Japan. The power plant's backup systems were damaged, and this led to a situation in which the plant's reactors began to overheat, because there was no power that could be used to operate the cooling system in order to keep everything under control, and ultimately this overheating led to a containment failure, and radioactive material was released into the environment. It was the worst nuclear disaster since Chernobyl. Cleaning up after the disaster was a really dangerous job. Response workers would be subjected to potentially dangerous levels of radiation for extended periods of time. DARPA's challenge was to give robotics teams a set of tasks that a robot would have to complete with minimal direction and intervention from the teams. The idea being, let's work toward a technological solution where we could develop robots that could step into situations that were like Fukushima and take the place of humans, so that human beings don't put their lives at risk to do this kind of stuff. The robots, since they have no lives, they could go and do it, and we wouldn't be putting any human life in jeopardy. That was the concept.
Speaker 1: But to do this, these robots would have to do very human-like things, and they'd have to maneuver in a very human-like world, because it was designed by humans, right? So no big surprise there. So the robots would have to do stuff like get into, get out of, and operate a vehicle, to be able to open doors and step through doorways, to pick up a tool and to use it properly. And teams were given some, but not all, of the information that they would actually need in order to complete the various tasks that DARPA had laid out, and the reason why they weren't given everything is because the whole concept requires teams to build a robot that could accomplish goals in a world that is unpredictable and uncontrolled for the most part. In a real emergency, there may be no way to account for all the variables, and depending on the nature of the emergency, a team might not be able to maintain a direct connection with their robot, so the robot would need to be able to handle some of this autonomously. Now, most of the challenges were pretty darn straightforward, and they would have been trivial for most people to complete if they were given the assignment. You know, if you ask your typical human to get into a vehicle kind of like a golf cart and to drive to a specific location, to then get out of that vehicle, to open a door, go through the door, pick up a power drill, drill a hole in a wall, climb some stairs, and navigate some rubble, you'd likely see a lot of people succeed at this. It's, again, a pretty simple set of tasks for most people; that's not a tough gig. But for robots, it's a totally different story. Now, there are compilations of videos of the various teams participating in this competition that show just how challenging it really was. There are videos of robots that, upon trying to just walk through a doorway, fall over.
One robot completed the list 575 00:35:27,040 --> 00:35:31,440 Speaker 1: of tasks, turned to wave to the crowd, then fell over 576 00:35:31,800 --> 00:35:35,440 Speaker 1: because that balance thing is really hard, y'all. Creating humanoid 577 00:35:35,520 --> 00:35:38,160 Speaker 1: robots that can interact with the technology that was made 578 00:35:38,160 --> 00:35:42,160 Speaker 1: for human beings requires a ton of consideration and cross-disciplinary 579 00:35:42,239 --> 00:35:48,040 Speaker 1: work. For one thing, sometimes it requires roboticists to ask, hey, 580 00:35:48,760 --> 00:35:52,400 Speaker 1: why did we make this thing work in this specific way? 581 00:35:52,600 --> 00:35:55,279 Speaker 1: It's almost like you have to go through reverse engineering 582 00:35:55,320 --> 00:35:58,000 Speaker 1: the world around you in order to understand why things 583 00:35:58,040 --> 00:36:00,560 Speaker 1: are the way they are. That doesn't always end with 584 00:36:00,600 --> 00:36:02,880 Speaker 1: an answer that makes much sense, by the way. Sometimes 585 00:36:02,920 --> 00:36:05,560 Speaker 1: we're like, wow, we should really change this because this 586 00:36:05,680 --> 00:36:07,640 Speaker 1: is not the best way to do it, but at 587 00:36:07,680 --> 00:36:10,799 Speaker 1: that point it's the established way to do it. Ultimately, 588 00:36:10,840 --> 00:36:13,680 Speaker 1: a team from the Republic of Korea won this competition. 589 00:36:13,920 --> 00:36:19,960 Speaker 1: The winning robot was named DRC dash Hubo. It 590 00:36:20,080 --> 00:36:23,720 Speaker 1: completed the series of tasks in just under forty five minutes, 591 00:36:24,080 --> 00:36:28,919 Speaker 1: and the team took home a two million dollar cash prize. Now, no lie, 592 00:36:29,040 --> 00:36:31,600 Speaker 1: two million dollars, that's a lot of money. But I 593 00:36:31,600 --> 00:36:35,200 Speaker 1: would be willing to bet that the two million dollars 594 00:36:35,239 --> 00:36:38,600 Speaker 1: doesn't remotely cover the costs of all the research, development, 595 00:36:38,640 --> 00:36:41,160 Speaker 1: and production of the robot itself. I bet if you 596 00:36:41,160 --> 00:36:44,560 Speaker 1: were to add up all the expenses of making this robot, 597 00:36:44,760 --> 00:36:47,080 Speaker 1: it would be more than two million dollars. But the 598 00:36:47,120 --> 00:36:51,359 Speaker 1: money wasn't really the full goal of this thing. Like, 599 00:36:51,400 --> 00:36:53,640 Speaker 1: that was an award, but you weren't doing it to 600 00:36:53,719 --> 00:36:56,879 Speaker 1: win the money. It's the challenge, trying to figure out 601 00:36:56,880 --> 00:37:00,680 Speaker 1: a way to achieve this really tough goal set out 602 00:37:00,960 --> 00:37:03,879 Speaker 1: by the nature of the competition itself. That's the real 603 00:37:04,000 --> 00:37:06,880 Speaker 1: draw. The engineers out there know what I'm talking about, 604 00:37:06,920 --> 00:37:10,480 Speaker 1: like that thrill of tackling a problem and figuring out 605 00:37:10,480 --> 00:37:13,839 Speaker 1: a solution. That's really what drives a lot of engineers. Now, 606 00:37:13,880 --> 00:37:17,400 Speaker 1: if anything, the challenge illustrated just how hard it is 607 00:37:17,440 --> 00:37:20,480 Speaker 1: to build a humanoid robot that can function properly. Now, 608 00:37:20,480 --> 00:37:22,560 Speaker 1: the benefits are pretty clear. 
You know, this kind of 609 00:37:22,640 --> 00:37:25,840 Speaker 1: robot could potentially step in during situations like the Fukushima 610 00:37:25,880 --> 00:37:28,520 Speaker 1: disaster, scenarios in which a human would be put into 611 00:37:28,560 --> 00:37:32,200 Speaker 1: danger and the robot, due to its design, could interface 612 00:37:32,239 --> 00:37:35,840 Speaker 1: with systems that had been built for humans. That's understandably 613 00:37:35,880 --> 00:37:39,040 Speaker 1: a worthy goal. It's just a very challenging one. And 614 00:37:39,080 --> 00:37:42,560 Speaker 1: it gets harder when we start to bring artificial intelligence 615 00:37:42,600 --> 00:37:45,480 Speaker 1: into this, because we've been mostly focused on things like locomotion. 616 00:37:45,760 --> 00:37:48,800 Speaker 1: But let's talk about AI. Now, I've told this story before, 617 00:37:48,840 --> 00:37:52,080 Speaker 1: but I went to a panel about robotics. It 618 00:37:52,120 --> 00:37:54,600 Speaker 1: was at South by Southwest. This was several years ago, 619 00:37:54,840 --> 00:37:57,600 Speaker 1: and at that panel, the presenters were talking about how 620 00:37:57,960 --> 00:38:01,360 Speaker 1: challenging it is to teach robots how to do things, 621 00:38:01,600 --> 00:38:04,560 Speaker 1: like not program the robots to do it, but to 622 00:38:04,600 --> 00:38:07,399 Speaker 1: teach them how to learn in an environment and then 623 00:38:07,560 --> 00:38:10,640 Speaker 1: replicate things that they have learned. Like even when you 624 00:38:10,680 --> 00:38:13,440 Speaker 1: build models in which the robots are able to observe 625 00:38:13,560 --> 00:38:17,280 Speaker 1: and then attempt to replicate actions, stuff can go wrong. 626 00:38:17,719 --> 00:38:22,080 Speaker 1: Robots have the same limitations as other examples of machine learning. So, 627 00:38:22,200 --> 00:38:25,600 Speaker 1: for example, I have often used the example that 628 00:38:25,680 --> 00:38:28,120 Speaker 1: teaching a computer to do something like recognize that an 629 00:38:28,160 --> 00:38:32,080 Speaker 1: image represents a specific object, like a coffee mug, is hard. 630 00:38:32,480 --> 00:38:35,120 Speaker 1: Not all coffee mugs are alike, right? Some come in 631 00:38:35,160 --> 00:38:39,480 Speaker 1: different sizes or shapes or colors. Some might have handles. 632 00:38:39,880 --> 00:38:42,719 Speaker 1: Those handles could be shaped one way versus another. The 633 00:38:42,920 --> 00:38:45,120 Speaker 1: pictures of them might be 634 00:38:45,160 --> 00:38:48,320 Speaker 1: under different lighting conditions, or they might be paired 635 00:38:48,360 --> 00:38:51,800 Speaker 1: with other stuff that's of a similar size or shape 636 00:38:51,840 --> 00:38:55,520 Speaker 1: to the coffee mug. All of these elements represent challenging 637 00:38:55,640 --> 00:38:59,880 Speaker 1: variables to machines that are being trained to recognize images. 638 00:39:00,080 --> 00:39:03,120 Speaker 1: You know, the machines do not inherently understand what makes 639 00:39:03,160 --> 00:39:05,920 Speaker 1: a coffee mug a coffee mug. That's what you are 640 00:39:05,960 --> 00:39:08,480 Speaker 1: teaching them. 
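For readers following along in the transcript, here is a minimal sketch of the kind of image-classification training the host is describing. None of this comes from the episode: it's Python with PyTorch and torchvision, and the folder layout, model choice, and hyperparameters are all illustrative assumptions.

```python
import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Hypothetical dataset layout: mugs/train/mug/*.jpg and mugs/train/not_mug/*.jpg
tfm = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
train_set = datasets.ImageFolder("mugs/train", transform=tfm)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)

# Start from a small pretrained backbone (torchvision 0.13+ weights API) and
# swap the final layer for a two-class head: mug vs. not-mug.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)

loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# The basic loop: show labeled images, measure how wrong the guesses are,
# nudge the weights, repeat.
for epoch in range(5):
    model.train()
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: last batch loss {loss.item():.3f}")
```

In practice you would also hold out a validation set, measure false positives on it, adjust, and retrain, which is exactly the tweak-and-train-again cycle described next.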
And you know, you can teach a human 641 00:39:08,560 --> 00:39:11,040 Speaker 1: being what a coffee mug is and they pretty much 642 00:39:11,040 --> 00:39:13,040 Speaker 1: get it pretty darn quickly, even to the point where 643 00:39:13,080 --> 00:39:16,719 Speaker 1: they can recognize other coffee mugs that don't look exactly 644 00:39:16,840 --> 00:39:20,000 Speaker 1: like the initial one. But for computers it requires a 645 00:39:20,040 --> 00:39:22,800 Speaker 1: lot more training. You train your model, then you tweak 646 00:39:22,880 --> 00:39:25,400 Speaker 1: all the settings so that you can improve your results, 647 00:39:25,440 --> 00:39:28,560 Speaker 1: you know, cut down on the false positives and fix 648 00:39:28,600 --> 00:39:31,279 Speaker 1: all the mistakes, and you train it again, and you're 649 00:39:31,320 --> 00:39:35,120 Speaker 1: potentially using millions of images in order to do this. Now, 650 00:39:35,239 --> 00:39:38,920 Speaker 1: consider the humble door. Now, a door is a pretty 651 00:39:39,000 --> 00:39:41,480 Speaker 1: darn simple thing to operate for most people, but there 652 00:39:41,480 --> 00:39:44,239 Speaker 1: are lots of different ways that a door could potentially 653 00:39:44,280 --> 00:39:47,359 Speaker 1: operate, right? Like, the door might have a knob that 654 00:39:47,400 --> 00:39:49,520 Speaker 1: you have to twist. It might have a bar that 655 00:39:49,600 --> 00:39:51,560 Speaker 1: you have to press, or a handle that you have 656 00:39:51,640 --> 00:39:54,359 Speaker 1: to pull. So when you encounter a door, chances are 657 00:39:54,440 --> 00:39:57,759 Speaker 1: you have a decent idea of how it works, but 658 00:39:57,840 --> 00:39:59,880 Speaker 1: you might not know which way it opens, and 659 00:40:00,719 --> 00:40:02,360 Speaker 1: that can give you a little bit of a pause. 660 00:40:02,600 --> 00:40:05,880 Speaker 1: One of my favorites of Gary Larson's Far Side cartoons, 661 00:40:05,920 --> 00:40:09,239 Speaker 1: a classic cartoon strip from the eighties, shows a 662 00:40:09,280 --> 00:40:12,440 Speaker 1: young boy pushing as hard as he possibly can on 663 00:40:12,520 --> 00:40:14,719 Speaker 1: a door, and just above his hand on the door 664 00:40:14,840 --> 00:40:17,640 Speaker 1: is a label that reads pull, and next to the 665 00:40:17,719 --> 00:40:21,400 Speaker 1: doorway is a sign that reads Midvale School for the Gifted. 666 00:40:21,719 --> 00:40:25,520 Speaker 1: I feel this cartoon in my soul some days. Well, 667 00:40:25,600 --> 00:40:28,080 Speaker 1: robots are kind of like that student. Even a robot 668 00:40:28,080 --> 00:40:30,760 Speaker 1: that's been trained to open doors might need to pause 669 00:40:30,800 --> 00:40:33,920 Speaker 1: and have a digital think about things before giving it 670 00:40:34,000 --> 00:40:37,040 Speaker 1: an attempt. So the South by Southwest panelist was telling 671 00:40:37,080 --> 00:40:39,919 Speaker 1: the story of such a robot, and this robot sat 672 00:40:39,960 --> 00:40:42,319 Speaker 1: outside a door in a hallway. 
I think it was 673 00:40:42,400 --> 00:40:46,640 Speaker 1: the electrical engineering department at the university that this panelist 674 00:40:46,640 --> 00:40:49,200 Speaker 1: worked at, and it sat there for several days 675 00:40:49,440 --> 00:40:53,160 Speaker 1: just contemplating the door, and it was really irritating folks 676 00:40:53,200 --> 00:40:55,600 Speaker 1: who worked there because they had to walk around this 677 00:40:55,680 --> 00:40:58,000 Speaker 1: thing in order to get through the hallway, and if 678 00:40:58,040 --> 00:40:59,960 Speaker 1: they passed in front of it, it irritated the 679 00:41:00,080 --> 00:41:04,040 Speaker 1: roboticists because it could actually disrupt the process and set 680 00:41:04,080 --> 00:41:06,839 Speaker 1: things back even further. But the robot was just trying 681 00:41:06,880 --> 00:41:09,520 Speaker 1: to figure out how it should proceed in order to 682 00:41:09,520 --> 00:41:11,400 Speaker 1: try and open the door. And when you think about 683 00:41:11,400 --> 00:41:14,440 Speaker 1: how a robot could potentially be powerful enough to cause 684 00:41:14,560 --> 00:41:17,680 Speaker 1: damage to the environment it's in if it attempts to 685 00:41:17,760 --> 00:41:21,360 Speaker 1: do something incorrectly, then you start to understand why taking 686 00:41:21,600 --> 00:41:24,719 Speaker 1: time might actually be a necessity. It might be an 687 00:41:24,719 --> 00:41:28,759 Speaker 1: important thing to build into robots. It seems ridiculous to 688 00:41:28,880 --> 00:41:30,960 Speaker 1: just stare at a door for days on end before 689 00:41:31,040 --> 00:41:34,520 Speaker 1: even attempting to open it, but if you could potentially 690 00:41:34,960 --> 00:41:37,480 Speaker 1: rip the handle off the door or damage the door 691 00:41:37,480 --> 00:41:41,360 Speaker 1: in some way, well, taking time might be a needed precaution. 692 00:41:42,160 --> 00:41:45,120 Speaker 1: And that's not even getting into the challenge of having 693 00:41:45,280 --> 00:41:48,640 Speaker 1: robots that operate within an environment in which there are also 694 00:41:48,719 --> 00:41:51,799 Speaker 1: human beings. Obviously, you have to take a lot into 695 00:41:51,800 --> 00:41:55,640 Speaker 1: consideration in those kinds of situations where robots and human 696 00:41:55,640 --> 00:41:58,280 Speaker 1: beings are going to be working within the same environment. 697 00:41:58,400 --> 00:42:02,400 Speaker 1: Like in most industrial uses for robots, the robots are 698 00:42:03,080 --> 00:42:06,279 Speaker 1: very much separated from all the people, like there are 699 00:42:06,520 --> 00:42:09,840 Speaker 1: multiple safety considerations put in place to keep the robots 700 00:42:09,880 --> 00:42:13,759 Speaker 1: and people away from each other because the potential for 701 00:42:14,000 --> 00:42:16,600 Speaker 1: catastrophe is way too high. If you're working too close 702 00:42:16,600 --> 00:42:19,280 Speaker 1: to a robot, like a robot that welds stuff 703 00:42:19,400 --> 00:42:21,359 Speaker 1: or whatever, well, you know you don't want to get 704 00:42:21,400 --> 00:42:24,719 Speaker 1: in the way of a welder, right? That would be awful. 705 00:42:24,520 --> 00:42:28,439 Speaker 1: It would potentially be deadly. So creating robots that are 706 00:42:28,560 --> 00:42:32,279 Speaker 1: capable of interacting among human beings comes with its 707 00:42:32,280 --> 00:42:35,400 Speaker 1: own series of challenges you have to overcome. 
So not 708 00:42:35,480 --> 00:42:37,640 Speaker 1: only must the robot be able to move around without 709 00:42:37,719 --> 00:42:40,680 Speaker 1: falling over onto somebody, it needs to be able to 710 00:42:40,840 --> 00:42:43,839 Speaker 1: do this in a way that doesn't cause anxiety or 711 00:42:43,920 --> 00:42:47,560 Speaker 1: fear or other negative reactions among the human beings. The really 712 00:42:47,560 --> 00:42:49,920 Speaker 1: weird thing is that sometimes a robot can behave a 713 00:42:49,920 --> 00:42:52,240 Speaker 1: little too human, and that can end up being almost 714 00:42:52,239 --> 00:42:54,719 Speaker 1: as bad as if it's not acting human enough. You've 715 00:42:54,719 --> 00:42:57,279 Speaker 1: got to find a balance. Like, there are expectations that 716 00:42:57,360 --> 00:42:59,720 Speaker 1: humans have when it comes to interacting with robots. 717 00:43:00,040 --> 00:43:04,000 Speaker 1: If the robots behave too far outside that set of expectations, 718 00:43:04,200 --> 00:43:08,520 Speaker 1: it can cause issues. So robotics has become a truly 719 00:43:08,600 --> 00:43:14,239 Speaker 1: multidisciplinary endeavor. Making a bipedal humanoid robot capable of integrating 720 00:43:14,280 --> 00:43:17,160 Speaker 1: with humans the way the Tesla Optimus robot is supposed 721 00:43:17,160 --> 00:43:20,960 Speaker 1: to do, that requires lots of work in disciplines that 722 00:43:21,040 --> 00:43:24,800 Speaker 1: go well outside of technology. We're talking about stuff like psychology. 723 00:43:25,120 --> 00:43:28,040 Speaker 1: And I think every time we see a remarkable achievement 724 00:43:28,239 --> 00:43:31,440 Speaker 1: in the robotics space, we're also reminded how far we 725 00:43:31,480 --> 00:43:34,359 Speaker 1: still have to go and how hard this really is. 726 00:43:34,880 --> 00:43:39,919 Speaker 1: So will Tesla's Optimus robots deliver upon all the promises 727 00:43:40,000 --> 00:43:46,440 Speaker 1: that Elon Musk often quotes at these events? Maybe. I'm skeptical, 728 00:43:46,560 --> 00:43:49,080 Speaker 1: largely because Elon Musk has proven to make some rather 729 00:43:49,120 --> 00:43:52,799 Speaker 1: ambitious claims in the past that have failed to manifest as described. 730 00:43:52,920 --> 00:43:56,160 Speaker 1: He's kind of the boy who cried fully autonomous driving. 731 00:43:56,480 --> 00:43:59,520 Speaker 1: And largely I also have doubts because I have an 732 00:43:59,520 --> 00:44:02,120 Speaker 1: inkling as to how hard it's going to be to 733 00:44:02,160 --> 00:44:06,600 Speaker 1: make a bipedal general purpose robot that's at least as 734 00:44:06,680 --> 00:44:10,040 Speaker 1: good as, and hopefully better than, a human being at 735 00:44:10,040 --> 00:44:13,160 Speaker 1: doing your typical tasks. If the robot is worse at 736 00:44:13,239 --> 00:44:15,759 Speaker 1: doing those tasks, then it's a waste of time and 737 00:44:15,840 --> 00:44:19,319 Speaker 1: money to use the robot. Just hire somebody else to 738 00:44:19,360 --> 00:44:23,600 Speaker 1: do it; that makes more sense. So these are 739 00:44:23,680 --> 00:44:26,319 Speaker 1: really high barriers that you 740 00:44:26,320 --> 00:44:28,480 Speaker 1: have to get over, and I don't think we're going 741 00:44:28,560 --> 00:44:30,680 Speaker 1: to get over them very quickly. I think it's going 742 00:44:30,760 --> 00:44:33,719 Speaker 1: to take years and years more work. 
But it is 743 00:44:33,800 --> 00:44:35,960 Speaker 1: a heck of a goal to aim for. I don't 744 00:44:35,960 --> 00:44:39,879 Speaker 1: want to shame anyone for taking aim at achieving this 745 00:44:40,239 --> 00:44:45,320 Speaker 1: really difficult task, because it drives innovation. I think that's important. 746 00:44:45,520 --> 00:44:49,320 Speaker 1: So I don't want to dismiss anyone who's working toward 747 00:44:49,719 --> 00:44:54,320 Speaker 1: building bipedal humanoid general purpose robots that have a level 748 00:44:54,320 --> 00:44:57,719 Speaker 1: of AI that allows them to operate autonomously within a 749 00:44:57,800 --> 00:45:00,839 Speaker 1: human environment. I think that is a phenomenal goal. I 750 00:45:01,000 --> 00:45:04,400 Speaker 1: just think it's also one that's going to require many 751 00:45:04,480 --> 00:45:09,040 Speaker 1: more years of work for it to be a viable project. Right? Like, 752 00:45:09,400 --> 00:45:13,880 Speaker 1: I guess I could see a disappointing version becoming a 753 00:45:13,920 --> 00:45:17,040 Speaker 1: reality within a couple of years. But that's really falling 754 00:45:17,160 --> 00:45:20,439 Speaker 1: far short of the promise, and I would much rather 755 00:45:20,520 --> 00:45:26,120 Speaker 1: see more work being done to improve the technology than 756 00:45:26,239 --> 00:45:30,319 Speaker 1: a premature release of some humanoid robot that just 757 00:45:30,360 --> 00:45:33,759 Speaker 1: doesn't do anything well enough to justify its existence. That 758 00:45:33,800 --> 00:45:36,280 Speaker 1: would really take a lot of wind out of the sails, 759 00:45:36,320 --> 00:45:39,200 Speaker 1: I think. All right, that's it for this episode of 760 00:45:39,320 --> 00:45:42,919 Speaker 1: Tech Stuff. I hope you're all well, and I'll talk 761 00:45:42,960 --> 00:45:53,400 Speaker 1: to you again really soon. Tech Stuff is an iHeartRadio production. 762 00:45:53,680 --> 00:45:58,719 Speaker 1: For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, 763 00:45:58,840 --> 00:46:00,840 Speaker 1: or wherever you listen to your favorite shows.