Speaker 1: Pushkin. So how long have you been trying to make a robot walk?

Speaker 2: It's been my entire career, starting from why I went to college in the first place.

Speaker 1: Why that particular problem? Why is that your life's work?

Speaker 2: You know, there are few things more interesting, more dynamically complex, and more elegant than the way animals move in the world. And to be able to get machines that can move that way, that can interact physically with the world the way humans and animals do? What a fun and interesting thing to work on for a career.

Speaker 1: I'm Jacob Goldstein, and this is What's Your Problem?, the show where I talk to people who are trying to make technological progress. My guest today is Jonathan Hurst. He's a professor at Oregon State University and founder and chief robot officer at Agility Robotics. Agility Robotics has built a robot called Digit. Digit looks kind of like a person. It walks around on two legs. It's got this flat, rectangular head, and it has two arms that it can use to pick stuff up. Jonathan's problem is this: how can you make a walking robot that can do useful work, and that companies will actually pay for? Jonathan says that robot, Digit, is already being tested out in warehouses in the real world.

Speaker 2: We are deploying robots with customers. We've announced a couple, with Amazon and with GXO. You know, you place an order and Digit handles that order as part of the workflow. That has happened, and is happening right now.

Speaker 1: So specifically, what are your robots doing?

Speaker 2: Well, first at Amazon, yeah, the first use case, or the first class of use cases that works for us, is picking up these plastic totes, plastic bins, and putting them somewhere else.

Speaker 1: And then, in warehouses?

Speaker 2: Yeah, but anywhere: warehouses, in logistics, in manufacturing, you know, whole environments like that that are a bit structured.
They're kind of, like, these islands of automation, they call them, you know, where one machine is putting things in a bin, another machine sorts bins and sends them to different parts of the warehouse. Right now, sometimes a person will stand there, the robot will tell them which bin to pick up, and then all they are, basically, is a manipulator for the robot's system. They pick up the bin and put it on the conveyor belt and wait for the robot to tell them the next thing to do. And it's really hard to hire people for that. There are a lot of open jobs in that, and so it's kind of a perfect place for Digit to walk in, in this relatively structured first use case. Now, Digit is of course going to evolve towards picking up boxes, depalletizing, loading and unloading, you know, tractor trailers, and eventually getting out to things like retail and stocking shelves and, you know, into hospitals, carrying things around, and eventually becoming a consumer product.

Speaker 1: So that's where you are today, and where you want to get. I want to talk now about how you got here.

Speaker 2: Sure.

Speaker 1: There's this really basic set of things you had to figure out, just around how locomotion works, right? How people walk, and also, interestingly and sort of surprisingly, how birds like ostriches walk. I know there's this series of robots that you built on the way to Digit, right? There were two before, and then Digit. And it seems like going through those, and the sort of key insight you figured out on each one, is a really nice way to get a deeper insight into how it works and what you had to understand to make a robot that can walk, that just being a really hard problem, right? Like, there's lots of robots that we don't even sort of think of as robots, arms, and, you know, self-driving cars are arguably a kind of robot and whatever, but getting a robot to walk is clearly a very hard problem.
Speaker 2: Yeah. Okay, so where I started in trying to understand how to make a machine walk is on the biomechanics, trying to understand how animals work. And a lot of biomechanics is about specific muscles and muscle groups and joints, and we're instead looking at it holistically, from a big picture: what is the center of mass of an animal doing? What are the forces happening on the ground? I did this collaboration with Monica Daley at the Royal Veterinary College, and we looked at guinea fowl and ostriches and turkeys, and, you know, a whole bunch of different sizes.

Speaker 1: Why birds? It's really interesting to me that you did that.

Speaker 2: Why? Because humans are weird. How many other bipeds are there? You know, all the evolved bipeds, all of the extant, existing bipeds in the world.

Speaker 1: Everything that walks on two legs. Yeah. Huh.

Speaker 2: Those are all theropods. They're all kind of more like a bird than they are like us, or a monkey. They have evolved far longer than we have. You know, we've been out of the trees for a very short period of time compared to all of the other existing bipeds in the world. They can all run much faster than we can, very efficiently, using less energy.

Speaker 1: So, like, an ostrich is a better model for just abstract walking on two legs?

Speaker 2: Oh, sure.

Speaker 1: I mean, yeah, I love that.

Speaker 2: But I want to be clear: what we're not trying to do is study how an ostrich runs versus how a human runs. What we're trying to do is study what is a fundamental truth among all animals in how they run, so we can try and weed out things that have to do with the size of the animal, or things that have to do with where exactly the ankle joint is, or how long this joint is or that one. We want to know: what is the same amongst all animals that walk and run?

Speaker 1: It's like the Platonic ideal of bipedalism. The sort of foundational theory of it.

Speaker 2: Yeah, that's right.
Speaker 2: And so biomechanists have been looking at this since the seventies, thinking about it in terms of spring-mass models of locomotion. And it was only in, like, the two thousands sometime that, I think it was, Hartmut Geyer and Andre Seyfarth put together a paper that showed, hey, this spring-mass model reproduces all of the behavior we see for walking and running, and the transitions between these gaits.

Speaker 1: When you say spring-mass model, that sounds big and exciting. But just to be clear, what do you mean when you say spring-mass model?

Speaker 2: I mean a mathematical representation of a pogo stick.

Speaker 1: Go on.

Speaker 2: Okay. So a pogo stick is basically the simplest thing that can run, and a kangaroo looks a lot like a pogo stick. Now, if you just stick a pogo stick on each leg, now you're bipedal running, you know.

Speaker 1: Okay.

Speaker 2: And then if you add a whole bunch of complexity to it, you have heel-toe, and, you know, knee joints and all this other stuff. But if you really boil it down and try and make it as simple as possible, you get to some pretty basic math models that do represent how the progression of the center of mass of the animal moves, and how the ground reaction forces progress, and so on.

Speaker 1: Right. Okay, so if I picture just, like, a lump of mass on top of two pogo sticks...

Speaker 2: You got it.

Speaker 1: I'm kind of in the right...

Speaker 2: You're absolutely okay. So that's roughly a math model that at least gives you a very good concept of how all animals run: horses, ghost crabs, humans, ostriches, whatever.
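The pogo-stick model described here is what the locomotion literature calls the spring-loaded inverted pendulum, or SLIP. Below is a minimal sketch of its stance phase, a point mass vaulting over one massless springy leg pinned at the foot; the parameter values are illustrative guesses, roughly human scale, not numbers from any paper mentioned here.

```python
# Minimal sketch of the spring-loaded inverted pendulum (SLIP), the
# "pogo stick" spring-mass model: stance phase only, foot pinned at
# the origin. Parameters are illustrative, roughly human scale.
import numpy as np
from scipy.integrate import solve_ivp

m, k, l0, g = 80.0, 20_000.0, 1.0, 9.81  # mass (kg), leg stiffness (N/m), rest leg length (m)

def stance(t, s):
    # s = [r, dr, th, dth]: leg length, its rate, leg angle from vertical, its rate
    r, dr, th, dth = s
    ddr = r * dth**2 + (k / m) * (l0 - r) - g * np.cos(th)
    ddth = (g * np.sin(th) - 2.0 * dr * dth) / r
    return [dr, ddr, dth, ddth]

def liftoff(t, s):
    return l0 - s[0]             # zero when the spring is back at rest length
liftoff.terminal, liftoff.direction = True, -1

# Touch down at rest length, compressing, leg angled back, rotating forward.
sol = solve_ivp(stance, [0.0, 1.0], [l0, -0.4, -0.3, 1.2],
                events=liftoff, max_step=1e-3)
grf = k * (l0 - sol.y[0])        # ground reaction force along the leg (N)
print(f"stance lasts {sol.t[-1]:.3f} s, peak GRF {grf.max():.0f} N "
      f"= {grf.max() / (m * g):.2f} body weights")
```

Liftoff is the moment the spring returns to its rest length, and the spring force along the leg is exactly the kind of ground reaction force a force plate, as discussed below, would measure.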
Speaker 1: Right. So this paper comes out, this paper that says: think of a lump of mass on top of two pogo sticks. You know about it because you're in this world. What effect does it have on you? What do you do as a result of it?

Speaker 2: What was I looking at? Here's the argument at the time: do these spring-mass models simply seem to describe the things we're observing, you know? Or are they describing the core physics of how it works? In other words, if you build a spring-mass model and a policy that works, is it going to make a robot stabilize? Or is it simply, like, a picture that kind of looks like what the animals are actually doing?

Speaker 1: So, like, if we actually build the lump of mass on top of two pogo sticks as a robot, will it work?

Speaker 2: Yeah. And so, look, the question is, you know, how do you control these things? And then, how does that translate into walking and running? Which had never really been done before that way. No one had done this continuous transition between walking and running and changing the speed and everything else, and it was unknown how to stabilize that over all kinds of terrain. So that's why we built ATRIAS. That's why we built the robot ATRIAS.

Speaker 1: And so what is ATRIAS?

Speaker 2: So ATRIAS is a bipedal robot.

Speaker 1: Okay, what does ATRIAS look like?

Speaker 2: So ATRIAS was on The Late Show with Stephen Colbert, and he described it as a microwave on stilts.

Speaker 1: Okay, that's good. So it doesn't look humanoid at all. It looks maybe like a dancing alien, or like a moon lander or something, but not humanoid.

Speaker 2: Yeah, absolutely not. It doesn't look like an animal in any way. But it's designed entirely to be the math model of what we see in an animal running. And the name ATRIAS is an acronym: Assume The Robot Is A Sphere. The whole idea is that the robot is this simple math model.

Speaker 1: Where it's just some mass in the middle and then some very light, springy legs.

Speaker 2: That's it. And so what we ultimately showed with ATRIAS: it's the first robot ever to reproduce human walking gait dynamics. The robot walks across the force plate; a graduate student walks across the force plate.
Looking at the data, you can't actually tell the difference.

Speaker 1: Huh. So, like, from the plate's point of view, it feels the same whether a robot or a human is walking across.

Speaker 2: That's correct.

Speaker 1: And that's not true for earlier robots that looked like they were walking like humans?

Speaker 2: They would look the same, but if you looked at the dynamics of it, if you looked at the ground reaction forces, they differed quite a lot.

Speaker 1: Why is that important?

Speaker 2: It's just one symptom, right, to show that, hey, we've actually captured the physics here. But the other symptom that's important is that we were able to walk and run outdoors, over all kinds of terrain, without any sort of perception. The robot could handle amazing obstacles and would just soak them up, you know: going over potholes, going from grass to pavement, going over big pieces of plywood we would throw in its way.

Speaker 1: And you're saying it didn't do this because it had, like, a clever brain. You're saying it was just the physics of the brainless machine that was able to deal.

Speaker 2: No cameras on it. It had no awareness of the environment. It was a very simple spring-mass model, with very, very simple control that did nothing but try to balance. And it was able to just absorb all these kinds of disturbances and keep on going.

Speaker 1: That sounds like a big deal.

Speaker 2: I agree. I'm very excited about this, right? That is the point at which we decided we were going to found Agility Robotics. We said, you know, this was a mission for, like, years and years and years. That is why I became a professor: to say, my goal here in academia is to show that this spring-mass physics is real, and to really make sure we understand it well. And if we can show that, and prove that, with this ATRIAS project, we then can take the next steps, right? So that's what we did.
It was a scientific kind of breakthrough. But the machine could only walk and run. It couldn't stand, it couldn't turn. You know, it broke often. If it ever fell, it would just be completely destroyed. So it's not a productive, useful machine. It's a science demonstrator.

Speaker 1: Okay. So you have ATRIAS, which is academically, kind of intellectually, a breakthrough, right? But nobody's going to buy it to do anything. It's not useful in a practical sense. What do you do next?

Speaker 2: Okay. So at that point we say, hey, you know, we understand a really significant portion of what it takes to make a robot that can go where people go. And what an opportunity this is. What are all the things that robots are going to be able to do when they can coexist with humans, right? And the barrier to moving forward on that now is more about execution on engineering, and execution on building a use case and a business around it.

Speaker 1: Because you feel like you've solved the sort of core threshold technical problem of a robot that can walk in unfamiliar environments and not fall down.

Speaker 2: Correct. We have the foundation layer of legged locomotion. Now, sure, we'd like to add perception to it, so that it can handle stairs effectively and things like that, and intentionally handle obstacles. We need to do the engineering of the electrical system and the hardware system that still captures the same physics, but can take a beating, can stand, can steer, can start to do things, right? So we know we're going to need to build a robot with legs, with manipulators, with sensors.

Speaker 1: Okay.

Speaker 2: So we're kind of going down a path now where we want to take the exact same first-principles approach to: how do we build a machine that can manipulate things in a human world, and get around in it, and interact with people? Build a human-centric machine.
So the first step to doing that was designing a robot that we could sell to other researchers to continue the work on the legs, as we then worked towards, you know, arms and manipulators. And that was Cassie, the first robot that Agility Robotics sold. Cassie added the ability to stand in place, because it had ankles, and it had the ability to steer, because you could turn the legs. But more importantly, it was much more compact and extremely robust, so this robot could fall hard on concrete and you'd just pick it back up and it could get going again.

Speaker 1: And so, Cassie. I'm looking at a picture of it now. It basically looks like a pair of ostrich legs. Yeah, it looks like an ostrich. Does an ostrich have a waist? I don't know. An ostrich from the waist down?

Speaker 2: No, it doesn't, no. Yeah, the pelvis is stationary in a bird, fixed, rather than a human pelvis, which is mobile.

Speaker 1: So I know it's not technically correct to say that an ostrich's legs bend backwards, but it looks that way, right?

Speaker 2: Yeah. Their thigh is very short, their knee is up next to their body, and what you perceive as their knee is actually their ankle.

Speaker 1: In designing this robot, how do you get to legs that look like ostrich legs? Like, it's like convergent evolution, you know? Convergent evolution, maybe, hopefully. Isn't it the case that, like, cephalopods, that, like, whatever, squids have eyes like our eyes, but they evolved totally independently? Is this like that?

Speaker 2: There are a lot of examples of convergent evolution. We can only guess, right, because we don't necessarily know, and scientists can only hypothesize the evolutionary pressures that cause animals to be the shape that they are.

Speaker 1: But the pressure that led you... you didn't say, let's make legs that look like ostrich legs. You just did a bunch of math, and you wound up with legs that look like ostrich legs.

Speaker 2: That is correct. And there are a bunch of features on our robot that have gone down a similar path.
And I actually love that, because when we, you know, blank sheet, pursue all of the different configuration options and say, okay, here's what we think is the optimal choice, and we say, wow, that looks just like a person, or that just looks like a bird or something? It's actually really good. It means we're probably on the right path.

Speaker 1: Yeah, it's exciting in a way, right? Like, you do a bunch of math, and then suddenly you look up and you see an ostrich.

Speaker 2: But it won't always be that way, right? Because we're not using muscle and bone. We're using aluminum and, you know, electricity. It's a whole different thing.

Speaker 1: In a way, it's surprising that it is, right? Like, you would expect that it wouldn't look at all familiar.

Speaker 2: But there are clear differences.

Speaker 1: I suppose I'm projecting this. It's like, whatever is the ostrich version of anthropomorphizing, right? I'm ostrich-pomorphizing.

Speaker 2: Yeah, you got it. And like you said, it's like a cartoon version of an ostrich leg, maybe.

Speaker 1: Yeah. Okay. So you've got this robot that looks, to my little brain, like a pair of ostrich legs. It's just a coincidence, because that just turns out to be the best way to build a couple of legs. And do you sell it to other academics? What do you do with it?

Speaker 2: Yeah, we sold it to some of the top universities in the country and in the world.

Speaker 1: So Cassie is a robot that academics can experiment with and learn from. But it doesn't have arms. It isn't built to do useful tasks. In a minute: Jonathan and his colleagues build a robot that does have arms, that can do useful tasks, and that companies like Amazon are testing out in the real world right now.

The latest robot from Jonathan's company is called Digit. It has two legs, two arms. It walks around, it picks stuff up, and it looks kind of like a person.
But Jonathan says he and his colleagues didn't set out to build a robot that looks like a person.

Speaker 2: So, yeah, so, like, I'll take you down one of these thought processes that ended up looking like a person. Okay. We said, okay, we need to do some sort of inertial control of this thing, because the robot can't turn very well. It's got little feet, and so when you try to turn aggressively, it skids, right? Okay. And if you look at any other bipeds, this is one of the reasons wings evolved. It's because, when they're running, the wings stick out to catch the air to help them in maneuvering and turning. You can go in a straight line, but figuring out how to maneuver quickly is hard when you've only got a little foot on the ground. You know, quadrupeds can really plant all four feet and twist and apply big torques on the ground, and a biped, not so much. You've got one foot at a time. How do you change your orientation?

Speaker 1: Right.

Speaker 2: So we looked at, like, putting a gyro on board, reaction wheels, or tails, or things like that. You know, we ruled out reaction wheels, because that's just a big thing of brass, mass that you don't want in a robot like this. We thought about tails, and if you look at any animals with tails, bipeds in particular, typically that's to control the pitch. In other words, you'd leap off the ground, and then you want to reorient your body and your feet so that your feet land forward rather than, you know, just tumbling in the air. Okay, but we don't want to control pitch. We want to control yaw. We want to steer the robot.
So what 358 00:18:02,156 --> 00:18:04,476 Speaker 2: we kind of came up with is that the best 359 00:18:04,516 --> 00:18:07,796 Speaker 2: way to do that would be a pair of tails 360 00:18:07,836 --> 00:18:10,556 Speaker 2: that are symmetrical on the front of the back or 361 00:18:10,596 --> 00:18:12,276 Speaker 2: the side of the side of the robot, so that 362 00:18:12,316 --> 00:18:14,916 Speaker 2: when you swing those tails, you're controlling exactly down the 363 00:18:14,916 --> 00:18:18,996 Speaker 2: center line of your yaw, and so that just happens 364 00:18:19,036 --> 00:18:21,116 Speaker 2: to be where our arms and shoulders are. You know, 365 00:18:21,676 --> 00:18:24,996 Speaker 2: this bilaterally symmetrical pair of tails that can inertially actuate 366 00:18:25,076 --> 00:18:27,076 Speaker 2: you around the center and allow you to steer. 367 00:18:27,836 --> 00:18:31,756 Speaker 1: So that's the sort of intellectually elegant version of how 368 00:18:31,796 --> 00:18:35,796 Speaker 1: you get to a robot that looks like it has arms. 369 00:18:35,956 --> 00:18:37,596 Speaker 1: You're saying, in fact, in terms of the way you 370 00:18:37,636 --> 00:18:40,156 Speaker 1: thought of it, it's a pair of tails that happened 371 00:18:40,156 --> 00:18:43,796 Speaker 1: to sit where our arms sit. I mean, presumably there's 372 00:18:43,796 --> 00:18:45,676 Speaker 1: a simpler version, which is, you want to build a 373 00:18:45,756 --> 00:18:47,796 Speaker 1: robot that can do stuff in a world that is 374 00:18:47,836 --> 00:18:50,756 Speaker 1: built for humans, and having arms would be useful in 375 00:18:50,796 --> 00:18:52,996 Speaker 1: that context as well. Or no, was it true? 376 00:18:53,036 --> 00:18:55,516 Speaker 2: Member? Yeah, what we're not trying to do is make 377 00:18:55,556 --> 00:18:57,476 Speaker 2: a humanoid robot that looks like a person. What we're 378 00:18:57,476 --> 00:19:00,516 Speaker 2: trying to do is on first principles, understand exactly why 379 00:19:00,516 --> 00:19:01,196 Speaker 2: we do each thing. 380 00:19:01,516 --> 00:19:03,676 Speaker 1: But are you really just adding the arm so that 381 00:19:03,756 --> 00:19:05,876 Speaker 1: it can turn when it's walking, Like I feel like 382 00:19:05,956 --> 00:19:07,636 Speaker 1: that's what you're saying, and I'm skeptical. 383 00:19:08,156 --> 00:19:11,196 Speaker 2: Yeah. It's also that there were three other reasons why 384 00:19:11,236 --> 00:19:13,916 Speaker 2: the arms should be there that were all aligned. They 385 00:19:13,916 --> 00:19:16,716 Speaker 2: were not compromises where you know, putting the arms here 386 00:19:16,796 --> 00:19:18,396 Speaker 2: is better for one thing or another. 387 00:19:18,676 --> 00:19:18,836 Speaker 1: Huh. 388 00:19:19,076 --> 00:19:21,356 Speaker 2: It's that. Hey, if you go on just the just 389 00:19:21,476 --> 00:19:24,716 Speaker 2: the path of I want to improve my locomotion capability, 390 00:19:25,076 --> 00:19:26,996 Speaker 2: you land at the solution of where the arms are. 391 00:19:27,356 --> 00:19:29,476 Speaker 2: If you go down the path of this robot's going 392 00:19:29,516 --> 00:19:32,236 Speaker 2: to fall, and we know that it can't just fall 393 00:19:32,276 --> 00:19:34,796 Speaker 2: and land on its torso it'll break things. How do 394 00:19:34,876 --> 00:19:37,516 Speaker 2: we put manipulator's arms something on it so that it's 395 00:19:37,516 --> 00:19:38,956 Speaker 2: going to be able to catch itself when. 
Speaker 1: So that's the sort of intellectually elegant version of how you get to a robot that looks like it has arms. You're saying, in fact, in terms of the way you thought of it, it's a pair of tails that happen to sit where our arms sit. I mean, presumably there's a simpler version, which is: you want to build a robot that can do stuff in a world that is built for humans, and having arms would be useful in that context as well. Or no? Was that true?

Speaker 2: Remember, what we're not trying to do is make a humanoid robot that looks like a person. What we're trying to do is, on first principles, understand exactly why we do each thing.

Speaker 1: But are you really just adding the arms so that it can turn when it's walking? Like, I feel like that's what you're saying, and I'm skeptical.

Speaker 2: Yeah. It's also that there were three other reasons why the arms should be there, and they were all aligned. They were not compromises, where, you know, putting the arms here is better for one thing or another.

Speaker 1: Huh.

Speaker 2: It's that, hey, if you go down just the path of "I want to improve my locomotion capability," you land at the solution of where the arms are. If you go down the path of "this robot's going to fall, and we know it can't just fall and land on its torso, it'll break things; how do we put manipulators, arms, something on it, so that it's going to be able to catch itself when it falls?"

Speaker 1: However it's oriented. Okay.

Speaker 2: And then the third one, of course, is picking up things, right? Manipulation in the world, and being bimanual in your manipulation, so that you can do, basically, a giant pincher grasp. That's how you pick up boxes and totes and all these things you want to move. That's also the best place for them.

Speaker 1: So basically, you just set out to build a machine that could go where humans go and pick up things that are the size that humans pick up, and from first principles, with your eyes closed, you wound up with a thing that looks like a guy.

Speaker 2: Absolutely.

Speaker 1: Okay, so this is how you get to Digit, the robot that you're now building and selling to people.

Speaker 2: That's right. So now we're making this transition, right now, as a company, from that very intellectual, first-principles approach that I shared with you, to now working with the customer, understanding what their use case is, writing down the sets of requirements, like, you know, the temperature ranges, the weights of all the things that you're going to be able to pick up, the safety requirements. And it's a massive list, hundreds of things in a list, to write down the requirements documents so that we can engineer a system that is a product. Yeah, very different from designing a robot that can do cool things. We're engineering a product, and that's the pivot that our company is in right this second.

Speaker 1: And, like, I imagine for you personally, that must be a significant shift, right? If you've spent, whatever, twenty-some years in the kind of abstract academic world of, like, let's build the thing that works and think deep thoughts, to, like, let's mass-produce a product that people will pay us for. That's quite different.

Speaker 2: Oh, it's fundamentally different. It's a whole different way of thinking. In fact, I changed my title to chief robot officer, right? And we hired Melonee Wise as our new chief technology officer. She comes out of Fetch Robotics.
She was the founder there and recently sold that company, and they were deploying thousands of robots in logistics warehouses. And she is an absolute expert on understanding customers and product, and creating a product. And what we've done is we've shifted our organization, so, you know, Melonee is in charge of that whole product side of the organization, and the engineering to make a product. I'm now leading the innovation team.

Speaker 1: So you get to keep doing kind of the stuff you've been doing.

Speaker 2: The things I'm good at.

Speaker 1: Yeah. So what is the frontier on the innovation side? What are you trying to figure out next?

Speaker 2: It is fundamentally hardware that enables the kinds of physics that we want to achieve, powered by some of these new AI tools. You know, we're getting to a point now where some of these tools will allow us to create behaviors and create things that, as engineers, we don't know how to model.

Speaker 1: Huh.

Speaker 2: And that's super interesting. So instead, you're describing the symptoms of it, and then the system, the learning system, figures out how to make that happen.

Speaker 1: Amazingly different from what you've been describing. You've been describing, like, let's think from first principles, just the physics of the universe, and from that build a machine. And now you're talking about, you know, an era when possibly you'll be able to ignore all of that and say to the AI, you figure it out, here's what I want to do.

Speaker 2: Well, let me put some caveats on that.

Speaker 1: Yeah, that sounds ridiculous when I say it that way.

Speaker 2: Well, remember that the AI has to operate on a piece of hardware, right? And so that piece of hardware we still have to engineer and design to be able to achieve the physics that we want to achieve.

Speaker 1: Though you could say to an AI, here's what we want to do, what should the hardware look like?
Maybe in one hundred years? You think one hundred? Who knows, one hundred.

Speaker 2: Five years, three years, you know. In the future. Not today.

Speaker 1: Certainly not. So what specifically are you doing today? Are you taking this robot that you have, Digit, and seeing, if you sort of put an AI layer in it, on it, near it, what can you do? Is that what's happening?

Speaker 2: All of the above. So, you know, on the hardware side, there's a lot you do on the hardware just to make it even possible for the AI to learn. We're building, then, a whole architecture of a digital twin, so that you can learn things in simulation first, and then, you know, transfer from sim to real.

Speaker 1: A digital twin is basically making a version of the robot that exists as software, that exists virtually.

Speaker 2: A version of the environment as well, that the robot operates in, so that everything can be done... you know, decades of experience on the robot can be done in hours of time through the simulation, et cetera.

Speaker 1: So the digital twin is allowing you to generate data to train the AI. Is that what's...

Speaker 2: That's the source of the data, right. A lot of language models and things like that are based on data from the internet. Well, I don't think that's feasible for robot control, because the physics of the hardware is so unique. So even if you're trying to teleoperate this thing, you've got this weird translation between what a person would do and then the robot trying to mimic that, which is probably not the right dynamics.

Speaker 1: The robot doesn't actually walk like a person, even if it looks like it does.

Speaker 2: Right, that's right. Everything's sort of different internally about it, about how it would control itself.
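A rough sketch of the sim-to-real recipe described here, using a toy stand-in rather than anything from Agility's actual stack: a "digital twin" (here just a damped pendulum) whose physical parameters are re-randomized every episode, so that a controller tuned purely in simulation tolerates the mismatch with the real hardware. This is usually called domain randomization; all numbers are invented for illustration.

```python
# Toy sketch of training in a randomized "digital twin": tune a controller
# purely in simulation while re-drawing the plant's physical parameters
# each episode (domain randomization). A damped pendulum stands in for the
# robot; all numbers are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)

def rollout(gains, params, steps=400, dt=0.01):
    """Cost of PD gains on one sampled twin: drive the angle to zero."""
    mass, length, damping = params
    kp, kd = gains
    th, dth, cost = 0.5, 0.0, 0.0                  # start tipped 0.5 rad
    for _ in range(steps):
        torque = -kp * th - kd * dth
        ddth = (torque - damping * dth
                - mass * 9.81 * length * np.sin(th)) / (mass * length**2)
        dth += ddth * dt                            # semi-implicit Euler step
        th += dth * dt
        cost += th**2 * dt
    return cost

def sample_twin():
    # Every episode is a slightly different robot: mass, length, friction.
    return rng.uniform([0.8, 0.9, 0.0], [1.2, 1.1, 0.2])

# Crude random-search "training", simulation only; a real pipeline would use
# reinforcement learning, but the randomization idea is the same.
best_gains, best_cost = None, np.inf
for _ in range(200):
    gains = rng.uniform(0.0, 50.0, size=2)
    cost = np.mean([rollout(gains, sample_twin()) for _ in range(8)])
    if cost < best_cost:
        best_gains, best_cost = gains, cost
print(f"kp={best_gains[0]:.1f} kd={best_gains[1]:.1f} mean sim cost {best_cost:.4f}")
```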
Speaker 1: What are you worried about? Like, what could go wrong, and how are you trying to get it not to go wrong?

Speaker 2: So before I talk about what I'm worried about, let me tell you what I'm excited about.

Speaker 1: Fair.

Speaker 2: We had ten of these Cassie robots out in the world, and so researchers all over the place, for years and years, were working on various kinds of controllers. When we were able to successfully get a learned policy working on Cassie, we were able to run a 5K across campus. We were able to set the world record in the hundred-meter dash.

Speaker 1: For a robot.

Speaker 2: Yes.

Speaker 1: And when you say learned, you mean developed with machine learning, as opposed to the old way. Is that what you mean?

Speaker 2: Correct. It's an entirely machine-learned policy that was learned in simulation and then put on the thing.

Speaker 1: What's a definition of a policy, as you're using the word?

Speaker 2: I mean, a policy is just a bunch of math that takes as input all of the sensors and then spits out numbers that describe the torques you should apply to the motors.

Speaker 1: Okay, so it's sort of: if this happens in the world, robot, you should do this set of things.

Speaker 2: Yeah, but, you know, it's like the input from thirty encoders, or a hundred different sensors, all of that complex input.

Speaker 1: So the "if this" is complicated, and the "then thats" are complicated.

Speaker 2: No, you're right about that. Yeah, it's an equation.

Speaker 1: Good. So the machine learning basically made the robot work way better.

Speaker 2: It made the robot work much better. And even more importantly, once the pipeline was in place, we could learn new skills and learn new policies much faster, with much less engineer time. We can get there faster, and have higher performance, using learning approaches to control.

Speaker 1: It's like a productivity supercharger. It just makes everything go much faster and more efficiently.

Speaker 2: Absolutely. It's a new tool. It's amazing, that's right.
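A minimal sketch of a "policy" in exactly the sense just defined: a function from the full sensor vector to one torque command per motor. The sizes and the torque bound are assumptions for illustration, not Digit's real interface, and the random weights here are what training, in simulation, would actually tune.

```python
# Minimal sketch of a "policy": sensors in, motor torques out. Sizes and
# the torque bound are illustrative assumptions; the weights are untrained.
import numpy as np

N_SENSORS, N_MOTORS, HIDDEN = 100, 10, 64  # "thirty encoders or a hundred different sensors"

rng = np.random.default_rng(0)
W1 = rng.normal(0.0, 0.1, (HIDDEN, N_SENSORS))
W2 = rng.normal(0.0, 0.1, (N_MOTORS, HIDDEN))

def policy(observation: np.ndarray) -> np.ndarray:
    """One control tick: sensor readings in, bounded motor torques (N m) out."""
    hidden = np.tanh(W1 @ observation)
    return 50.0 * np.tanh(W2 @ hidden)

torques = policy(rng.normal(size=N_SENSORS))
print(torques.round(2))
```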
Speaker 2: Okay. So what am I worried about? What am I worried about... I'm worried that this is one of those kind of black swan events, right? This is one of those things that changes everything, and everybody doesn't exactly know what all the implications are yet, and what are, you know, the right paths forward, and so everybody's trying everything.

Speaker 1: And this being, basically, a useful bipedal robot? Like, it could be hugely important in ways that we don't understand, and there could be unintended consequences?

Speaker 2: What I actually mean is that this new realization, that we can use learned policies to control dynamic robots and machines, means that the entire way people have been doing robot controls before is not as relevant. And this new tool that nobody really understands that well yet is clearly the future of how it's going to work.

Speaker 1: So what are... I mean, I understand that if it's really a black swan, you don't know what's going to happen, because if you knew, it wouldn't be a black swan. But, like, what are you thinking of? What could it mean? You know, plainly, bipedal robots are a very powerful tool, and you could imagine malevolent uses of them, right? I guess... I guess we've already got drones, right? It doesn't matter whether the robot that kills you looks like a dude, right? Those things already exist.

Speaker 2: In fact, a humanoid robot is probably the least effective way to do that, I think. Honestly, my biggest worry about AI in general has a lot more to do with its ability to influence people, its ability to model people's feelings, and that kind of thing.

Speaker 1: So that's more like using large language models for kind of personalized misinformation, or that sort of thing.

Speaker 2: That's the biggest threat. Building robots that are going to, like, take people... I don't know, I just don't see it.

Speaker 1: Fair. I mean, I'm kind of tired of talking about technological unemployment, but because the robot looks like a person,
it makes me feel like we should touch on it. Do you want to just speak for a moment to the prospect of technological unemployment?

Speaker 2: Sure. I'll say our entire business model is centered around the number of unfilled roles that exist in the logistics environment. It's not centered around how it's going to be less expensive than human labor. It's centered around how companies actually, in certain geographic locations, do not have enough people to provide the service that they're providing, you know, the way they're doing it now. There's no way forward for improved logistics, and getting your things in one day, and, you know, all the stuff that people really, really want. There's no path forward to do that with more human labor doing it. It must be automated in a significant way. So, you know, our whole business model is based on unfilled roles.

Speaker 1: What is your, like, happy version of the future, in whatever number of years seems like the right number?

Speaker 2: Man. These robots are actually safe and smart enough to do a lot of useful things in the world, and the relationship with people is kind of like a service animal. And, you know, these robots are everywhere: delivering all the packages to your door, and, you know, being a telepresence device that you can easily log into and keep in touch with people, and, you know, in a lot of warehousing and stocking, and, you know, doing the dull, dirty, dangerous, the classic three Ds of robotics. We've always wanted that. It's all about improving the quality of life. It's all about enabling humans to be more human, letting people do the things that they want to do, that involve the social interaction and the creativity and the variety that people are so good at, and having robots that can pick up all of the tasks that we'd rather not do. And having robots that can be in environments that are designed around us
611 00:30:09,716 --> 00:30:12,436 Speaker 2: is a really important step to being able to achieve 612 00:30:12,476 --> 00:30:13,316 Speaker 2: that in a really great way. 613 00:30:14,436 --> 00:30:15,956 Speaker 1: Is there anything else you want to talk about? 614 00:30:17,396 --> 00:30:22,076 Speaker 2: Mm. One thing I want to make sure that we 615 00:30:22,116 --> 00:30:26,796 Speaker 2: get across is kind of the clear argument that the 616 00:30:26,956 --> 00:30:30,516 Speaker 2: fastest path to a general purpose machine that does useful 617 00:30:30,516 --> 00:30:33,196 Speaker 2: work in human spaces is to do one thing first 618 00:30:33,436 --> 00:30:35,316 Speaker 2: and then the second thing, and to do it for 619 00:30:35,396 --> 00:30:38,356 Speaker 2: customers and to get it deployed and to figure out 620 00:30:38,396 --> 00:30:41,036 Speaker 2: the reliability and the safety and so on. That's the path. 621 00:30:41,836 --> 00:30:43,556 Speaker 2: There is no good way to just, like, jump to 622 00:30:43,596 --> 00:30:44,036 Speaker 2: the answer. 623 00:30:44,356 --> 00:30:46,916 Speaker 1: So basically you're saying you can't just build a robot 624 00:30:46,956 --> 00:30:48,636 Speaker 1: that does everything. You have to build a robot that 625 00:30:48,636 --> 00:30:51,036 Speaker 1: does one thing and then figure out how to make 626 00:30:51,076 --> 00:30:51,796 Speaker 1: it do a second thing. 627 00:30:52,276 --> 00:30:53,836 Speaker 2: Yeah, but you want to stay on that vision. You 628 00:30:53,836 --> 00:30:55,956 Speaker 2: want to have your guess of what the everything robot 629 00:30:55,956 --> 00:30:58,356 Speaker 2: looks like, but you know you're going to be wrong, 630 00:30:58,956 --> 00:31:00,876 Speaker 2: and so then you start on what's the match to 631 00:31:00,916 --> 00:31:02,756 Speaker 2: the first thing that it should be able to do, 632 00:31:03,356 --> 00:31:05,836 Speaker 2: and then you keep on revising and iterating on that path. 633 00:31:06,076 --> 00:31:08,396 Speaker 2: So I'll give an example. This is through our 634 00:31:08,516 --> 00:31:11,156 Speaker 2: entire ethos of building the function first and the 635 00:31:11,156 --> 00:31:13,636 Speaker 2: physics first and the first principles approach to figuring out 636 00:31:13,636 --> 00:31:16,116 Speaker 2: things like the legs. Right, yeah, I want to point out, 637 00:31:16,156 --> 00:31:20,676 Speaker 2: like, hands: how do you produce a dexterous manipulator? By dexterous, 638 00:31:20,756 --> 00:31:22,596 Speaker 2: I mean something that can pick up pens and pick 639 00:31:22,676 --> 00:31:25,476 Speaker 2: up objects and do useful things with the hand, open doorknobs, 640 00:31:25,516 --> 00:31:28,396 Speaker 2: all the stuff. Right, I don't care how it looks, 641 00:31:28,596 --> 00:31:32,396 Speaker 2: I care what it does. So what we're doing for 642 00:31:32,436 --> 00:31:35,476 Speaker 2: our first use case is picking up totes that can 643 00:31:35,516 --> 00:31:38,516 Speaker 2: have a twenty kilogram bowling ball rolling around in them. 644 00:31:38,996 --> 00:31:41,276 Speaker 2: That's a very hard thing to pick up. And so 645 00:31:41,396 --> 00:31:43,956 Speaker 2: now our grippers are these big graspers that can grasp 646 00:31:43,996 --> 00:31:46,436 Speaker 2: the side of this tote and pick it up.
A 647 00:31:46,476 --> 00:31:48,796 Speaker 2: lot of groups sort of have these five-fingered 648 00:31:48,836 --> 00:31:52,556 Speaker 2: hands on their robots which look like a human's hand, 649 00:31:53,156 --> 00:31:55,476 Speaker 2: but certainly couldn't pick up those twenty kilogram totes. 650 00:31:55,676 --> 00:31:59,036 Speaker 1: What do you have? Just a pincer, sort of, basically? Yeah. 651 00:31:58,676 --> 00:32:01,236 Speaker 2: You know, effectively it's a big sort of four-fingered 652 00:32:01,276 --> 00:32:03,956 Speaker 2: pincer thing that can have leverage on it and grasp 653 00:32:04,036 --> 00:32:04,956 Speaker 2: the sides of the tote. 654 00:32:05,036 --> 00:32:07,436 Speaker 1: I mean, does that mean that it can't do sort 655 00:32:07,476 --> 00:32:10,476 Speaker 1: of fine motor things? Is that the trade-off that you're making? 656 00:32:10,596 --> 00:32:12,996 Speaker 2: I would say the current pincer design, yeah, it's not 657 00:32:13,116 --> 00:32:15,636 Speaker 2: designed to pick up small objects. But, you know, we 658 00:32:15,676 --> 00:32:18,996 Speaker 2: see a vision where that's actually a tool, not a hand. 659 00:32:19,516 --> 00:32:22,276 Speaker 2: And just like people use tools, robots are gonna use tools. 660 00:32:22,636 --> 00:32:26,916 Speaker 2: But there's such an opportunity to have the tool be 661 00:32:26,996 --> 00:32:29,556 Speaker 2: attached at the wrist or the fingers or the elbow 662 00:32:29,636 --> 00:32:31,876 Speaker 2: or the forearm or wherever. And we don't know exactly 663 00:32:31,876 --> 00:32:34,636 Speaker 2: how that should be, and nobody does. But one thing 664 00:32:34,636 --> 00:32:38,636 Speaker 2: I can say with confidence is that these five-fingered 665 00:32:38,636 --> 00:32:42,356 Speaker 2: hands cannot do dexterous manipulation. It's not just a controls problem. 666 00:32:42,356 --> 00:32:47,116 Speaker 2: They're making the same mistake of creating something that has 667 00:32:47,156 --> 00:32:50,876 Speaker 2: the same morphology. It looks like a person. It looks 668 00:32:50,916 --> 00:32:52,836 Speaker 2: like a person, but that doesn't mean it can apply 669 00:32:52,916 --> 00:32:54,996 Speaker 2: the right forces, or have the right kinematics, or have 670 00:32:55,036 --> 00:32:57,996 Speaker 2: the right dynamics or the right compliance or anything like that. 671 00:32:58,076 --> 00:32:59,476 Speaker 2: We don't know how to do that. It's one of 672 00:32:59,476 --> 00:33:02,556 Speaker 2: the grand challenges in robotics. So the fastest path to 673 00:33:02,596 --> 00:33:05,636 Speaker 2: get there is to start manipulating things and do stuff 674 00:33:05,636 --> 00:33:08,836 Speaker 2: that has metrics, right, measures, a 675 00:33:09,396 --> 00:33:11,716 Speaker 1: job that someone is actually willing to pay for. That's right. 676 00:33:11,796 --> 00:33:15,156 Speaker 1: So right now it's moving totes around, basically. What's 677 00:33:15,196 --> 00:33:15,996 Speaker 1: the second job? 678 00:33:16,876 --> 00:33:19,916 Speaker 2: Boxes. Cardboard boxes. Huh. And then the third job is, 679 00:33:19,956 --> 00:33:21,916 Speaker 2: we want to start doing each picking, you know, 680 00:33:22,196 --> 00:33:24,116 Speaker 2: picking up things and putting them in the boxes and 681 00:33:24,156 --> 00:33:25,556 Speaker 2: in the... So those are much 682 00:33:25,436 --> 00:33:30,396 Speaker 1: smaller things, different than picking up a big tote. 683 00:33:30,236 --> 00:33:32,476 Speaker 2: That's right.
And so the question is, and nobody knows 684 00:33:32,476 --> 00:33:35,156 Speaker 2: the answer to this, is that two different tools or 685 00:33:35,236 --> 00:33:38,036 Speaker 2: is that one general purpose manipulator that can do both 686 00:33:38,036 --> 00:33:38,476 Speaker 2: of those things? 687 00:33:38,476 --> 00:33:40,196 Speaker 1: Do they need to put a different hand on the robot 688 00:33:40,196 --> 00:33:42,156 Speaker 1: to pick up a little thing versus pick up a 689 00:33:42,156 --> 00:33:43,756 Speaker 1: big thing, or is one optimal? 690 00:33:44,556 --> 00:33:46,396 Speaker 2: And you know, this will be, this gets into a 691 00:33:46,756 --> 00:33:51,316 Speaker 2: customer requirements question rather than a fundamental science question. Yeah, that's, 692 00:33:51,516 --> 00:33:53,676 Speaker 2: that's for your product person to figure out. Yeah, 693 00:33:53,716 --> 00:33:55,716 Speaker 2: but that's how this is going to evolve. That's how 694 00:33:55,716 --> 00:33:57,956 Speaker 2: we're going to get to dexterous manipulation in the world 695 00:33:58,076 --> 00:33:59,356 Speaker 2: in a way that really is meaningful. 696 00:33:59,396 --> 00:34:01,836 Speaker 1: In essence, one job at a time. Yeah. 697 00:34:01,756 --> 00:34:05,196 Speaker 2: Figure out, if you've got the first, like, you know, 698 00:34:05,236 --> 00:34:07,596 Speaker 2: the first four or five jobs, and now you've got 699 00:34:07,596 --> 00:34:11,076 Speaker 2: your requirements, your sets of requirements and your measurements and your 700 00:34:11,076 --> 00:34:13,236 Speaker 2: metrics, and now engineers are going to be able to 701 00:34:13,236 --> 00:34:16,436 Speaker 2: iterate on that and really make something work. But just trying 702 00:34:16,436 --> 00:34:17,916 Speaker 2: to copy a five-fingered hand and say now 703 00:34:17,956 --> 00:34:20,556 Speaker 2: it's an AI problem? Completely false. That's just not 704 00:34:20,676 --> 00:34:21,196 Speaker 2: going to happen. 705 00:34:24,956 --> 00:34:27,196 Speaker 1: We'll be back in a minute with the lightning round. 706 00:34:38,076 --> 00:34:42,556 Speaker 1: Let's finish with the lightning round. I've read that you 707 00:34:42,676 --> 00:34:45,196 Speaker 1: like to jog at night for work, but, like, are 708 00:34:45,236 --> 00:34:47,396 Speaker 1: there specific things that have happened to you, or that 709 00:34:47,476 --> 00:34:50,756 Speaker 1: you've done to try and, sort of, you know, put 710 00:34:50,796 --> 00:34:54,396 Speaker 1: yourself in hard walking or running settings, where you've 711 00:34:54,436 --> 00:34:57,076 Speaker 1: actually had a thing happen and thought, oh, we need 712 00:34:57,116 --> 00:34:59,996 Speaker 1: to make sure the robot doesn't fall over when X happens? 713 00:35:00,916 --> 00:35:03,796 Speaker 2: Honestly, it's more like when you're taking a hike, you know, 714 00:35:03,836 --> 00:35:05,316 Speaker 2: and you kind of get into the mode of, you're 715 00:35:05,356 --> 00:35:08,596 Speaker 2: just, it's a long day and a long hike, and, you know, 716 00:35:08,676 --> 00:35:12,836 Speaker 2: watching and being mindful, I guess, and thinking about what 717 00:35:12,876 --> 00:35:16,076 Speaker 2: are my feet doing and how is the contact progressing 718 00:35:16,156 --> 00:35:18,636 Speaker 2: with the ground and how does that feel and why 719 00:35:18,756 --> 00:35:21,796 Speaker 2: is that happening?
And sort of daydreaming while you're thinking 720 00:35:21,796 --> 00:35:24,476 Speaker 2: that through is a really good way then to start 721 00:35:24,476 --> 00:35:29,596 Speaker 2: to recognize connections to research papers and connections to things 722 00:35:29,596 --> 00:35:32,196 Speaker 2: that people have found scientifically, and then start to pull 723 00:35:32,236 --> 00:35:35,556 Speaker 2: together hypotheses about how you would implement something like this 724 00:35:35,636 --> 00:35:38,956 Speaker 2: on a robot, or what is necessary because we're human, 725 00:35:39,196 --> 00:35:42,716 Speaker 2: and what is necessary because it's fundamental to locomotion. 726 00:35:43,236 --> 00:35:45,876 Speaker 2: Like, why do we have feet? That's very different from 727 00:35:45,916 --> 00:35:49,596 Speaker 2: bird feet. ATRIAS didn't have feet, you know. And then 728 00:35:49,636 --> 00:35:51,916 Speaker 2: what exactly do we get? Anyway, thinking through all that 729 00:35:51,996 --> 00:35:54,516 Speaker 2: kind of stuff. 730 00:35:56,836 --> 00:35:58,596 Speaker 1: C-3PO or R2-D2? 731 00:36:01,876 --> 00:36:04,476 Speaker 2: Why, that is a tough one. I'm going to say 732 00:36:04,556 --> 00:36:05,316 Speaker 2: R2-D2. 733 00:36:05,836 --> 00:36:09,836 Speaker 1: Not the bipedal one, not the one that walks on two legs? 734 00:36:10,116 --> 00:36:12,476 Speaker 2: C-3PO? Well, no, I'm going to change my mind. 735 00:36:13,436 --> 00:36:15,036 Speaker 1: I biased it. Wait, why were you going to say 736 00:36:15,076 --> 00:36:16,396 Speaker 1: R2? I was going to say R2 to 737 00:36:16,596 --> 00:36:19,796 Speaker 2: you, because C-3PO's a protocol droid. And if 738 00:36:19,836 --> 00:36:22,676 Speaker 2: it's just about language, why do you have legs? Whereas, 739 00:36:22,716 --> 00:36:23,676 Speaker 2: on the other hand, you... 740 00:36:24,076 --> 00:36:25,756 Speaker 1: Just, you don't need a robot at all, right? You 741 00:36:25,836 --> 00:36:27,556 Speaker 1: just need a little phone or whatever. 742 00:36:27,676 --> 00:36:30,756 Speaker 2: On the other hand, right, it's a human-centric robot. 743 00:36:30,836 --> 00:36:32,956 Speaker 2: It's a robot that's meant to be a translator and 744 00:36:33,076 --> 00:36:35,196 Speaker 2: be existing with people in the room and so on, 745 00:36:35,436 --> 00:36:38,316 Speaker 2: and so actually making something that's more of a humanoid 746 00:36:38,436 --> 00:36:40,836 Speaker 2: makes more sense if it's basically a social robot. 747 00:36:41,196 --> 00:36:45,316 Speaker 1: Yeah, okay. Uh, what's something you wish that a robot 748 00:36:45,356 --> 00:36:48,196 Speaker 1: could do for you, you know, outside of work? 749 00:36:50,396 --> 00:36:54,716 Speaker 2: Honestly, what I want is for robots to make everything cheaper. 750 00:36:55,196 --> 00:36:58,436 Speaker 1: Uh huh. I love it when technology makes things cheaper. Well, 751 00:36:58,476 --> 00:37:00,716 Speaker 1: and that's what's been happening for the past couple hundred years, 752 00:37:00,756 --> 00:37:04,876 Speaker 1: and all of automation has effectively made everybody richer 753 00:37:04,916 --> 00:37:08,396 Speaker 1: and improved everybody's quality of life because everything is cheaper. 754 00:37:09,196 --> 00:37:12,036 Speaker 1: And I want to be able to book my vacation 755 00:37:12,156 --> 00:37:15,116 Speaker 1: to the moon.
I want to be able to not 756 00:37:15,196 --> 00:37:17,716 Speaker 1: really worry about, you know... As it is, I don't 757 00:37:17,716 --> 00:37:20,316 Speaker 1: really worry about how much my phone costs or whatever. 758 00:37:20,396 --> 00:37:23,396 Speaker 1: If it breaks, it's not an expensive phone, it's fine, 759 00:37:23,436 --> 00:37:25,436 Speaker 1: it works, and all my software just transfers over. I 760 00:37:25,436 --> 00:37:28,196 Speaker 1: want everything to be like that. I want my lifestyle 761 00:37:28,276 --> 00:37:32,556 Speaker 1: to be supported by things that don't really cost a lot. Yes, well, 762 00:37:32,596 --> 00:37:36,036 Speaker 1: and I mean in the very long run, the sort 763 00:37:36,036 --> 00:37:40,956 Speaker 1: of technology-driven productivity gains lift people out of poverty. Right? 764 00:37:40,996 --> 00:37:43,596 Speaker 1: Everybody cares about our phones, but there are a lot 765 00:37:43,636 --> 00:37:45,676 Speaker 1: of people who have three dollars a day right now, 766 00:37:45,716 --> 00:37:46,996 Speaker 1: and it would be great if they could get to 767 00:37:47,076 --> 00:37:48,356 Speaker 1: three hundred dollars a day. 768 00:37:48,436 --> 00:37:50,876 Speaker 2: Or beyond that. It's just that, you know, all the 769 00:37:50,916 --> 00:37:54,716 Speaker 2: things that we need become very easy to acquire. Food 770 00:37:54,796 --> 00:37:57,876 Speaker 2: is no longer expensive: labor on farms is done in an 771 00:37:57,876 --> 00:38:01,116 Speaker 2: automated way, labor in terms of logistics and transporting things 772 00:38:01,116 --> 00:38:03,156 Speaker 2: is done in an automated way, and so all of 773 00:38:03,156 --> 00:38:05,796 Speaker 2: this stuff becomes so affordable that it's easy to uplift 774 00:38:05,796 --> 00:38:08,876 Speaker 2: the quality of life of everybody on Earth. That is 775 00:38:08,916 --> 00:38:10,076 Speaker 2: what I want from robotics. 776 00:38:15,956 --> 00:38:19,116 Speaker 1: Jonathan Hurst is the co-founder and chief robot officer 777 00:38:19,236 --> 00:38:24,716 Speaker 1: at Agility Robotics. Today's show was produced by Gabriel Hunter Chang. 778 00:38:24,996 --> 00:38:28,316 Speaker 1: It was edited by Lydia Jane Kott and engineered by 779 00:38:28,356 --> 00:38:31,956 Speaker 1: Sarah Bruguer. You can email us at problem at Pushkin 780 00:38:32,036 --> 00:38:35,236 Speaker 1: dot FM. I'm Jacob Goldstein, and we'll be back next 781 00:38:35,236 --> 00:38:45,556 Speaker 1: week with another episode of What's Your Problem.