1 00:00:15,356 --> 00:00:23,396 Speaker 1: Pushkin. As a general matter, I'm a fan of technological progress, 2 00:00:23,916 --> 00:00:27,116 Speaker 1: but I'll admit that humanoid robots kind of creep me 3 00:00:27,156 --> 00:00:32,156 Speaker 1: out for, you know, obvious, normie, uncanny valley type reasons. 4 00:00:32,956 --> 00:00:36,316 Speaker 1: And yet there is an exchange that you'll hear near 5 00:00:36,396 --> 00:00:40,116 Speaker 1: the end of today's show that's the most compelling argument 6 00:00:40,196 --> 00:00:43,916 Speaker 1: I've ever heard for humanoid robots. And it's not just 7 00:00:43,956 --> 00:00:48,836 Speaker 1: an intellectual argument, it's an emotional argument, if that's a phrase. 8 00:00:49,036 --> 00:00:59,676 Speaker 1: It's really a very human argument for humanoid robots. I'm 9 00:00:59,756 --> 00:01:02,116 Speaker 1: Jacob Goldstein, and this is What's Your Problem, the show 10 00:01:02,156 --> 00:01:04,156 Speaker 1: where I talk to people who are trying to make 11 00:01:04,236 --> 00:01:08,636 Speaker 1: technological progress. My guest today is Jeff Cardenas. He's the 12 00:01:08,756 --> 00:01:12,756 Speaker 1: co-founder and CEO of Apptronik. Jeff's problem is this: 13 00:01:13,636 --> 00:01:17,636 Speaker 1: can you make a safe, reliable humanoid robot for less 14 00:01:17,636 --> 00:01:21,636 Speaker 1: than fifty thousand dollars? We started our conversation talking about 15 00:01:21,756 --> 00:01:25,996 Speaker 1: the DARPA Robotics Challenge. DARPA, of course, is the government 16 00:01:26,036 --> 00:01:28,876 Speaker 1: agency that helped to create the Internet and that gave 17 00:01:28,916 --> 00:01:31,876 Speaker 1: a big push to early self-driving cars, among other things. 18 00:01:32,316 --> 00:01:36,196 Speaker 1: And Jeff says the agency's robotics challenge, which happened a 19 00:01:36,236 --> 00:01:39,516 Speaker 1: decade ago, in twenty fifteen, played a key role 20 00:01:39,636 --> 00:01:42,156 Speaker 1: in launching a bunch of the companies that are now 21 00:01:42,196 --> 00:01:43,756 Speaker 1: working on humanoid robots. 22 00:01:46,116 --> 00:01:50,076 Speaker 2: The DARPA Robotics Challenge was a challenge that was created 23 00:01:50,116 --> 00:01:54,076 Speaker 2: in the wake of the Fukushima disaster. Fukushima happened, 24 00:01:54,116 --> 00:01:56,636 Speaker 2: you know, there was a meltdown in the nuclear 25 00:01:56,636 --> 00:01:59,476 Speaker 2: reactors and it was unsafe for people to go in, 26 00:02:00,236 --> 00:02:04,356 Speaker 2: and essentially what happened was they needed a robot 27 00:02:04,396 --> 00:02:07,796 Speaker 2: to go in to sort of prevent a broader catastrophe. 28 00:02:08,596 --> 00:02:11,436 Speaker 2: And as they went out to the robotics community, the 29 00:02:11,476 --> 00:02:15,636 Speaker 2: idea was: calling all roboticists, we need a robot to 30 00:02:15,676 --> 00:02:18,996 Speaker 2: go in and to help out here. And there were 31 00:02:19,036 --> 00:02:21,356 Speaker 2: no robots that could do all the different tasks that 32 00:02:21,396 --> 00:02:24,276 Speaker 2: were required to actually get to the melted-down reactors. 33 00:02:24,356 --> 00:02:25,956 Speaker 2: You had to go down steps, you had to 34 00:02:25,996 --> 00:02:28,796 Speaker 2: open doors, you had to do a whole range of things.
35 00:02:29,436 --> 00:02:34,396 Speaker 2: And in the wake of that, basically what DARPA said was, certainly, 36 00:02:34,396 --> 00:02:36,916 Speaker 2: there's got to be the technology to enable us to 37 00:02:36,956 --> 00:02:41,236 Speaker 2: have much more versatile robots for natural disaster relief so 38 00:02:41,356 --> 00:02:44,996 Speaker 2: this never happens again. So out of that, DARPA created 39 00:02:44,996 --> 00:02:48,396 Speaker 2: something called the DARPA Robotics Challenge, and so there were a 40 00:02:48,476 --> 00:02:51,476 Speaker 2: variety of teams around the world that were put together 41 00:02:51,756 --> 00:02:57,156 Speaker 2: to build these general purpose robots. And the team that 42 00:02:57,156 --> 00:03:00,076 Speaker 2: we came out of was the NASA Johnson Space 43 00:03:00,116 --> 00:03:02,956 Speaker 2: Center team working on a robot called Valkyrie. 44 00:03:03,436 --> 00:03:05,636 Speaker 1: So I want to go back to this moment of 45 00:03:05,716 --> 00:03:10,156 Speaker 1: the DARPA Challenge, and there's this big final test, and like, 46 00:03:10,556 --> 00:03:14,756 Speaker 1: what we have from it is not some incredible breakthrough, 47 00:03:14,756 --> 00:03:18,636 Speaker 1: but like a blooper reel of robots. What are they doing? 48 00:03:18,716 --> 00:03:21,356 Speaker 1: Falling down stairs or driving cars into walls or something. 49 00:03:21,276 --> 00:03:26,156 Speaker 2: Pretty much. I mean, the blooper reels make it 50 00:03:26,196 --> 00:03:29,996 Speaker 2: seem worse than it was, but yeah, 51 00:03:30,396 --> 00:03:33,956 Speaker 2: basically the realization was the technology's not there. It's 52 00:03:33,956 --> 00:03:37,396 Speaker 2: going to take time to continue to mature until it 53 00:03:37,436 --> 00:03:40,076 Speaker 2: can get to the point where it's actually commercially viable. 54 00:03:40,676 --> 00:03:43,756 Speaker 1: And so it's interesting, it's super interesting, right, that this 55 00:03:43,836 --> 00:03:47,876 Speaker 1: moment is not the, like, beginning of some humanoid robot winter, 56 00:03:48,236 --> 00:03:52,156 Speaker 1: but rather the beginning of this humanoid robot industry. Like, 57 00:03:52,196 --> 00:03:55,436 Speaker 1: how does that work? Like, how do people, whatever, start 58 00:03:55,436 --> 00:03:58,956 Speaker 1: companies and get money out of this seemingly disappointing moment? 59 00:04:00,036 --> 00:04:02,596 Speaker 2: Well, I think it actually was a winter, you know. 60 00:04:03,156 --> 00:04:05,676 Speaker 2: When we started, you know, the company was incorporated 61 00:04:05,716 --> 00:04:08,796 Speaker 2: in twenty fifteen and we started in twenty sixteen, and 62 00:04:09,476 --> 00:04:12,876 Speaker 2: for the most part, nobody wanted to talk about humanoids 63 00:04:12,956 --> 00:04:15,876 Speaker 2: and nobody was really paying attention to it. There was 64 00:04:15,916 --> 00:04:18,196 Speaker 2: a handful of folks that I sort of think of 65 00:04:18,236 --> 00:04:21,876 Speaker 2: as the true believers that were really working on this 66 00:04:21,996 --> 00:04:24,196 Speaker 2: problem and thought, you know, we don't care how long 67 00:04:24,236 --> 00:04:25,956 Speaker 2: this is going to take. We're just going to keep 68 00:04:25,956 --> 00:04:30,276 Speaker 2: working on this no matter what.
But for the most part, 69 00:04:30,516 --> 00:04:33,876 Speaker 2: the entire robotics industry was very anti-humanoids coming out 70 00:04:33,876 --> 00:04:36,396 Speaker 2: of the DARPA Robotics Challenge, and in fact, there were 71 00:04:36,476 --> 00:04:39,516 Speaker 2: many people that were saying they'll never be viable, like, 72 00:04:39,596 --> 00:04:42,436 Speaker 2: why would you ever use a humanoid robot? They're too complicated, 73 00:04:42,636 --> 00:04:46,796 Speaker 2: they're way too expensive, and you'll always use a simpler robot. 74 00:04:47,236 --> 00:04:49,476 Speaker 2: So actually, most of the people that we met when 75 00:04:49,476 --> 00:04:52,756 Speaker 2: we decided to start Apptronik were doubters and were saying, 76 00:04:53,356 --> 00:04:57,996 Speaker 2: humanoids will never make sense. We'll use these special purpose robots. 77 00:04:58,076 --> 00:05:00,756 Speaker 2: You know, maybe in fifty years humanoids will make sense, 78 00:05:00,796 --> 00:05:01,836 Speaker 2: but not for a long time. 79 00:05:02,996 --> 00:05:07,396 Speaker 1: I mean, special purpose robots is a pretty compelling case, right? 80 00:05:07,516 --> 00:05:12,196 Speaker 1: Like, whatever, we all have, you know, dishwashing robots and 81 00:05:12,276 --> 00:05:15,636 Speaker 1: clothes cleaning robots in our houses, and like, you know, 82 00:05:15,756 --> 00:05:18,956 Speaker 1: wheels seem way easier than legs for lots of things. 83 00:05:18,996 --> 00:05:22,156 Speaker 1: And obviously there have been robot arms for, what, I 84 00:05:22,196 --> 00:05:25,196 Speaker 1: don't know, seventy years now or something. Like, so robots 85 00:05:25,236 --> 00:05:27,676 Speaker 1: in a way are all around us. Why would you 86 00:05:27,676 --> 00:05:29,716 Speaker 1: build a machine that looks like a dude when that's 87 00:05:29,796 --> 00:05:30,916 Speaker 1: like wildly hard? 88 00:05:30,756 --> 00:05:34,636 Speaker 2: Right. Yeah, I mean, I was naive coming out of 89 00:05:34,676 --> 00:05:38,036 Speaker 2: graduate school, and to me it seemed obvious. And the 90 00:05:38,076 --> 00:05:39,796 Speaker 2: way I used to think about it was you could 91 00:05:39,796 --> 00:05:42,236 Speaker 2: either have thousands of robots that do one thing, or 92 00:05:42,316 --> 00:05:44,876 Speaker 2: you could build one robot that could do thousands of 93 00:05:44,916 --> 00:05:48,276 Speaker 2: different things. And when I would talk about this with Nick, 94 00:05:48,396 --> 00:05:51,036 Speaker 2: my co-founder, Nick would say, look, you can either, 95 00:05:51,476 --> 00:05:54,996 Speaker 2: you know, invest all this engineering in each of these 96 00:05:55,036 --> 00:05:58,556 Speaker 2: sort of narrow solutions, or, yes, a humanoid robot, a 97 00:05:58,636 --> 00:06:02,076 Speaker 2: viable humanoid robot, could take you years, it could take 98 00:06:02,076 --> 00:06:04,756 Speaker 2: you a decade, but once you invest all this time 99 00:06:04,796 --> 00:06:08,236 Speaker 2: in this single platform, then you can reap the benefits 100 00:06:08,276 --> 00:06:13,276 Speaker 2: of that across... you can spread the research of that 101 00:06:13,556 --> 00:06:15,356 Speaker 2: across many different applications. 102 00:06:15,996 --> 00:06:18,436 Speaker 1: I mean, is there not a middle case where there's 103 00:06:18,516 --> 00:06:22,836 Speaker 1: like some core kind of functionality that you develop that 104 00:06:22,916 --> 00:06:25,556 Speaker 1: works across many different types of robots?
Is that a 105 00:06:25,636 --> 00:06:28,916 Speaker 1: less straw-man version of the non-humanoid robot kind 106 00:06:28,956 --> 00:06:29,556 Speaker 1: of argument? 107 00:06:29,916 --> 00:06:32,436 Speaker 2: I think there could be. But I came into robotics 108 00:06:32,516 --> 00:06:35,196 Speaker 2: and basically just saw a lot of challenges with the 109 00:06:35,236 --> 00:06:39,516 Speaker 2: business models. So you build this special purpose robot, you 110 00:06:39,716 --> 00:06:42,796 Speaker 2: custom program the robot. In the industrial space, you can 111 00:06:42,836 --> 00:06:46,236 Speaker 2: spend six times the price of the robot on just 112 00:06:46,276 --> 00:06:49,596 Speaker 2: systems integration, and then the robot just does one thing. 113 00:06:50,116 --> 00:06:51,916 Speaker 2: So this idea that you could have a much more 114 00:06:52,036 --> 00:06:55,396 Speaker 2: versatile robot, to me, seemed obvious. Like, if robotics is 115 00:06:55,436 --> 00:06:57,756 Speaker 2: going to scale, we have to have much more versatile 116 00:06:57,836 --> 00:07:00,116 Speaker 2: robots than we've had in the past. So if you 117 00:07:00,196 --> 00:07:02,156 Speaker 2: sort of think of that as the premise, that we 118 00:07:02,236 --> 00:07:05,156 Speaker 2: need more versatile robots, then the question is, well, how 119 00:07:05,156 --> 00:07:08,276 Speaker 2: do you get there and what does versatility mean? And 120 00:07:09,116 --> 00:07:12,956 Speaker 2: that's where it led me to the humanoid making a 121 00:07:12,956 --> 00:07:15,916 Speaker 2: lot of sense, because if you have to modify the 122 00:07:16,076 --> 00:07:18,756 Speaker 2: environment for every new task that the robot can do, 123 00:07:19,396 --> 00:07:21,316 Speaker 2: you run into the same problem that we had in 124 00:07:21,396 --> 00:07:24,996 Speaker 2: sort of classical robotics. But if the robot can retrofit 125 00:07:25,156 --> 00:07:27,876 Speaker 2: into the environment, such that you don't have to change 126 00:07:27,956 --> 00:07:30,756 Speaker 2: or modify the environment for every new task that the 127 00:07:30,836 --> 00:07:33,876 Speaker 2: robot can do, then it seemed to me that this 128 00:07:33,916 --> 00:07:37,156 Speaker 2: would maybe be the key unlock for robotics to actually 129 00:07:37,196 --> 00:07:38,396 Speaker 2: scale to the masses. 130 00:07:38,916 --> 00:07:40,956 Speaker 1: The demand would be infinite. If you had a thing 131 00:07:40,996 --> 00:07:42,876 Speaker 1: that was the size and shape of a person with 132 00:07:43,036 --> 00:07:46,556 Speaker 1: arms and legs, like, scale would be off the charts, 133 00:07:46,556 --> 00:07:49,316 Speaker 1: and presumably that's what drives costs down, and that's like 134 00:07:49,476 --> 00:07:50,956 Speaker 1: the good flywheel. 135 00:07:50,516 --> 00:07:51,956 Speaker 2: Right, yeah, exactly. 136 00:07:52,436 --> 00:07:55,036 Speaker 1: Okay, so you had this big idea about humanoid robots 137 00:07:55,476 --> 00:07:57,676 Speaker 1: and you started a company. But at the moment you 138 00:07:57,716 --> 00:08:00,836 Speaker 1: started the humanoid robot company, the prevailing sentiment was, like, 139 00:08:01,116 --> 00:08:03,956 Speaker 1: deeply skeptical. What happened? What did you do? 140 00:08:03,996 --> 00:08:06,196 Speaker 2: Well, a handful of us kept 141 00:08:06,236 --> 00:08:08,996 Speaker 2: working on it. So, I didn't know any better.
You know, 142 00:08:09,236 --> 00:08:11,196 Speaker 2: sometimes it's better that you don't know any better. I 143 00:08:11,556 --> 00:08:16,076 Speaker 2: thought humanoids were really cool, and it 144 00:08:16,196 --> 00:08:18,156 Speaker 2: just seemed, it made sense to me that, you know, 145 00:08:18,476 --> 00:08:20,876 Speaker 2: how are we going to get to millions of robots 146 00:08:21,316 --> 00:08:24,596 Speaker 2: that are working, you know, with and around humans 147 00:08:24,596 --> 00:08:26,596 Speaker 2: in all these environments? And to me, this seemed like 148 00:08:26,636 --> 00:08:28,876 Speaker 2: the only way that that was going to happen. And 149 00:08:28,916 --> 00:08:32,036 Speaker 2: the way I looked at it was, even if we failed, 150 00:08:32,036 --> 00:08:35,076 Speaker 2: this was a worthy pursuit and I would be proud 151 00:08:35,076 --> 00:08:38,196 Speaker 2: that I tried to do it. And so the way 152 00:08:38,236 --> 00:08:40,836 Speaker 2: that we did it was we bootstrapped the company. There 153 00:08:40,916 --> 00:08:43,876 Speaker 2: were no investors that were willing to invest in humanoid 154 00:08:43,956 --> 00:08:48,596 Speaker 2: robots at the time that we got started, especially for hardware, 155 00:08:48,636 --> 00:08:51,156 Speaker 2: which we can talk about as we move forward. 156 00:08:52,636 --> 00:08:55,876 Speaker 2: And so we bootstrapped the company and we basically got 157 00:08:55,956 --> 00:08:58,276 Speaker 2: paid to build robots for a lot of different folks. 158 00:08:58,996 --> 00:09:01,756 Speaker 2: And for the first five years of the company, we 159 00:09:01,916 --> 00:09:04,836 Speaker 2: just built the company on revenue. We would get project 160 00:09:04,876 --> 00:09:07,716 Speaker 2: after project and somehow never died. 161 00:09:08,116 --> 00:09:10,276 Speaker 1: Like, what kind of projects were you taking at that time? 162 00:09:10,316 --> 00:09:11,756 Speaker 1: What's one example? 163 00:09:12,116 --> 00:09:14,436 Speaker 2: Well, our first contract was with NASA. So we had 164 00:09:14,436 --> 00:09:17,196 Speaker 2: a contract with NASA to build Valkyrie two, to take 165 00:09:17,236 --> 00:09:19,636 Speaker 2: the lessons learned from the DARPA Challenge and build the 166 00:09:19,676 --> 00:09:23,436 Speaker 2: next iteration of Valkyrie. We were really kind of pioneering 167 00:09:23,676 --> 00:09:27,356 Speaker 2: new ways of building these systems. So US Special Forces 168 00:09:27,396 --> 00:09:30,316 Speaker 2: ended up coming to us about a year in and said, hey, 169 00:09:30,316 --> 00:09:32,996 Speaker 2: we want to do Iron Man suits, and our view was 170 00:09:33,076 --> 00:09:35,596 Speaker 2: this was kind of a humanoid robot that you wear. 171 00:09:36,196 --> 00:09:39,916 Speaker 2: We worked in automotive. We helped build humanoid robots for 172 00:09:40,116 --> 00:09:44,956 Speaker 2: a couple of, you know, major companies that are still working 173 00:09:44,996 --> 00:09:47,476 Speaker 2: on these things today, and we would help sort of 174 00:09:47,476 --> 00:09:50,636 Speaker 2: pioneer new ways of building their platforms. So we've done 175 00:09:50,756 --> 00:09:54,036 Speaker 2: fifteen unique robots since we got started, and we're now 176 00:09:54,076 --> 00:09:56,596 Speaker 2: on our ninth iteration of humanoid, and have only 177 00:09:56,676 --> 00:09:58,196 Speaker 2: raised money in the last couple of years.
178 00:09:59,516 --> 00:10:04,956 Speaker 1: So where'd the idea to build a robot for fifty 179 00:10:05,036 --> 00:10:07,996 Speaker 1: thousand dollars come from? 180 00:10:08,396 --> 00:10:10,476 Speaker 2: The idea to build a robot for fifty 181 00:10:10,516 --> 00:10:13,316 Speaker 2: thousand dollars came from asking what it will take for these robots 182 00:10:13,316 --> 00:10:16,556 Speaker 2: to be economic and reach mass market. So, you know, 183 00:10:17,436 --> 00:10:21,316 Speaker 2: when we got started, sort of my view was, you know, 184 00:10:21,796 --> 00:10:25,636 Speaker 2: what will a truly viable commercial humanoid look like, and 185 00:10:26,556 --> 00:10:28,316 Speaker 2: what would the BOM cost need to be for this 186 00:10:28,396 --> 00:10:30,556 Speaker 2: to make sense? And if you sort of just do that 187 00:10:30,596 --> 00:10:34,636 Speaker 2: bottoms-up, at about fifty thousand dollars for the robot, you're 188 00:10:34,676 --> 00:10:37,596 Speaker 2: in the money for mass market. You can still do 189 00:10:37,676 --> 00:10:40,756 Speaker 2: some tasks in a very economic way at even one 190 00:10:40,796 --> 00:10:43,556 Speaker 2: hundred thousand or one hundred and fifty thousand, but fifty 191 00:10:43,596 --> 00:10:46,396 Speaker 2: thousand was the goal. This has now been blown past 192 00:10:46,516 --> 00:10:48,836 Speaker 2: by some of the new entrepreneurs that are coming out 193 00:10:48,876 --> 00:10:51,836 Speaker 2: that are talking about sub twenty thousand dollars. But it 194 00:10:52,556 --> 00:10:56,036 Speaker 2: never made sense to me that robots were as expensive 195 00:10:56,076 --> 00:10:58,956 Speaker 2: as they were at the time. If you look at 196 00:10:58,996 --> 00:11:02,236 Speaker 2: a humanoid compared to a car, there's about four percent 197 00:11:02,276 --> 00:11:06,116 Speaker 2: of the raw material by weight. So in one of our robots 198 00:11:06,116 --> 00:11:08,756 Speaker 2: there's about three hundred dollars of raw aluminum, which is 199 00:11:08,796 --> 00:11:13,916 Speaker 2: the base metal of the system. And so it 200 00:11:14,396 --> 00:11:17,076 Speaker 2: never made sense to me that these robots would need 201 00:11:17,116 --> 00:11:19,716 Speaker 2: to be any more than fifty thousand dollars, as you 202 00:11:19,756 --> 00:11:21,796 Speaker 2: could reach scale and as you can start to think 203 00:11:21,836 --> 00:11:24,436 Speaker 2: about new ways of building them, in similar ways that 204 00:11:24,476 --> 00:11:25,556 Speaker 2: we build other machines. 205 00:11:26,836 --> 00:11:29,716 Speaker 1: So you decide you want to build a fifty-thousand-dollar 206 00:11:29,756 --> 00:11:34,956 Speaker 1: robot. Like, what do you actually do to do that? Like, 207 00:11:35,516 --> 00:11:37,396 Speaker 1: how do you go from having an idea of building 208 00:11:37,396 --> 00:11:39,556 Speaker 1: a fifty-thousand-dollar robot to having a fifty-thousand-dollar 209 00:11:39,556 --> 00:11:40,156 Speaker 1: robot? 210 00:11:40,596 --> 00:11:44,156 Speaker 2: Well, you iterate until you solve the problem. So what 211 00:11:44,196 --> 00:11:46,476 Speaker 2: we would do is basically we would get a project 212 00:11:46,556 --> 00:11:51,836 Speaker 2: or a contract to build a robot, and we would 213 00:11:52,196 --> 00:11:56,476 Speaker 2: put a lot of different ideas into those designs. In the 214 00:11:56,716 --> 00:11:58,836 Speaker 2: early days, it was all about performance.
How can you 215 00:11:58,836 --> 00:12:01,836 Speaker 2: get the robot to just do these tasks, to stand 216 00:12:01,916 --> 00:12:04,876 Speaker 2: and have a battery life that's long enough? And then 217 00:12:04,876 --> 00:12:07,836 Speaker 2: as we kept evolving, we started to really focus on 218 00:12:09,436 --> 00:12:14,356 Speaker 2: cost in addition, and on scalability and assemblability and robustness. And 219 00:12:14,396 --> 00:12:17,756 Speaker 2: the key building block to drive cost and performance is 220 00:12:17,756 --> 00:12:21,276 Speaker 2: the actuator. So I mentioned we've done nine iterations of humanoids, 221 00:12:21,276 --> 00:12:24,356 Speaker 2: but we've done sixty iterations of electric actuators. 222 00:12:24,716 --> 00:12:27,676 Speaker 1: Actuators are basically the thing that makes the robot move. 223 00:12:27,516 --> 00:12:30,836 Speaker 2: Right, yeah, the muscles of a robot. 224 00:12:31,316 --> 00:12:33,716 Speaker 1: So where are you now? Tell me about Apollo. 225 00:12:34,756 --> 00:12:36,996 Speaker 2: Yeah, we're now at an exciting point. We have about 226 00:12:37,036 --> 00:12:40,956 Speaker 2: one hundred and seventy employees at Apptronik. We are piloting 227 00:12:41,396 --> 00:12:45,276 Speaker 2: these robots right now. So I think the entire industry 228 00:12:45,356 --> 00:12:48,676 Speaker 2: is still in the pilot stage overall. There's some commercial 229 00:12:48,756 --> 00:12:51,836 Speaker 2: orders that are happening, but still early days for humanoids. 230 00:12:52,956 --> 00:12:55,876 Speaker 2: We're working with a handful of really great partners, folks 231 00:12:55,876 --> 00:12:59,676 Speaker 2: like Mercedes and GXO, and we're getting the robots out 232 00:12:59,676 --> 00:13:03,156 Speaker 2: into the real world in pretty big ways, and 233 00:13:03,276 --> 00:13:06,396 Speaker 2: so we'll have more announcements over this year. We have 234 00:13:06,436 --> 00:13:08,916 Speaker 2: a big partnership with Google DeepMind, which is something 235 00:13:08,916 --> 00:13:11,676 Speaker 2: that I always dreamed of coming out of graduate school. 236 00:13:11,716 --> 00:13:13,916 Speaker 2: We had a lot of respect for the folks 237 00:13:14,436 --> 00:13:17,076 Speaker 2: at Google, and they have a whole history and legacy 238 00:13:17,676 --> 00:13:21,836 Speaker 2: in the humanoid space as well. And basically right now 239 00:13:21,836 --> 00:13:25,316 Speaker 2: we're getting these robots out into the world and gearing 240 00:13:25,476 --> 00:13:29,156 Speaker 2: up for, you know, real commercialization, which we expect to 241 00:13:29,196 --> 00:13:31,276 Speaker 2: happen in twenty twenty six. 242 00:13:32,156 --> 00:13:33,276 Speaker 1: What's the robot look like? 243 00:13:33,836 --> 00:13:36,876 Speaker 2: The robot kind of looks like a superhero. Maybe this 244 00:13:36,916 --> 00:13:39,396 Speaker 2: has been kind of the idea that we've had from 245 00:13:39,436 --> 00:13:43,076 Speaker 2: the beginning. It's got two eyes and a face. It's 246 00:13:43,116 --> 00:13:46,156 Speaker 2: five foot eight, weighs one hundred and sixty pounds, has 247 00:13:46,476 --> 00:13:50,116 Speaker 2: four-hour swappable batteries. Yeah, it's got a screen on 248 00:13:50,156 --> 00:13:53,716 Speaker 2: its chest and a face, and that's about it. 249 00:13:53,796 --> 00:13:57,756 Speaker 1: Two arms, two legs. What does it have in the way 250 00:13:57,756 --> 00:14:00,156 Speaker 1: of hands?
251 00:14:00,156 --> 00:14:03,156 Speaker 2: So it has hands, it has five-fingered hands. You know, 252 00:14:03,516 --> 00:14:05,916 Speaker 2: there's these debates that I think of as false debates 253 00:14:05,956 --> 00:14:08,516 Speaker 2: in the humanoid space. So a lot of people, when 254 00:14:08,516 --> 00:14:11,476 Speaker 2: they sort of knock humanoids and the viability of humanoids, 255 00:14:11,476 --> 00:14:13,516 Speaker 2: it usually has to do with: do they need legs 256 00:14:13,556 --> 00:14:15,996 Speaker 2: and do they need hands? And the answer to that 257 00:14:16,076 --> 00:14:18,476 Speaker 2: question for me is no, they don't. It's a robot, 258 00:14:18,556 --> 00:14:21,716 Speaker 2: and robots are modular. So we can put Apollo on 259 00:14:21,796 --> 00:14:24,156 Speaker 2: any mobility base. We can put it on wheels, we 260 00:14:24,156 --> 00:14:27,436 Speaker 2: could put it on tracks, we could stationary-mount the 261 00:14:27,476 --> 00:14:31,076 Speaker 2: upper torso of Apollo. And the same thing's true for 262 00:14:31,196 --> 00:14:33,876 Speaker 2: the hands or the grippers. We can use parallel grippers, 263 00:14:34,876 --> 00:14:36,516 Speaker 2: or we can use five-fingered hands. 264 00:14:37,036 --> 00:14:39,796 Speaker 1: Hands are like a whole thing, right? Like, hands are... 265 00:14:40,436 --> 00:14:42,916 Speaker 1: is it partly because they're so hard? Like, what's going 266 00:14:42,916 --> 00:14:44,036 Speaker 1: on with robots and hands? 267 00:14:44,116 --> 00:14:46,356 Speaker 2: Yeah, it turns out hands are a whole thing. This 268 00:14:46,436 --> 00:14:48,876 Speaker 2: is another one of those things where, you know, 269 00:14:48,956 --> 00:14:51,876 Speaker 2: it's almost better that you don't understand the complexity before 270 00:14:51,876 --> 00:14:54,676 Speaker 2: you get into it, or else you might not have 271 00:14:54,756 --> 00:14:58,276 Speaker 2: done it in the first place. Ninety-eight percent of 272 00:14:58,396 --> 00:15:02,236 Speaker 2: all tasks that humans do are done with our hands. 273 00:15:02,596 --> 00:15:06,836 Speaker 2: So there are narrow things that humanoids can do without 274 00:15:06,956 --> 00:15:10,876 Speaker 2: more dexterity, but it's very limited relative to the whole, 275 00:15:11,036 --> 00:15:13,876 Speaker 2: sort of, you know, all the different types of tasks 276 00:15:13,916 --> 00:15:17,476 Speaker 2: that humans do. Most of the things we do involve 277 00:15:17,476 --> 00:15:19,716 Speaker 2: our hands, and certainly in the industrial space, most of 278 00:15:19,716 --> 00:15:23,236 Speaker 2: the work is done with hands. So solving the end 279 00:15:23,236 --> 00:15:26,356 Speaker 2: effector, or the hand, problem is a big deal. There's 280 00:15:26,396 --> 00:15:28,916 Speaker 2: a lot of different debates about what you need and 281 00:15:28,996 --> 00:15:31,756 Speaker 2: how you get something that can actually perform industrial work. 282 00:15:32,876 --> 00:15:36,516 Speaker 2: You know, we've chosen the five-fingered hand route and 283 00:15:36,556 --> 00:15:40,196 Speaker 2: we're working across the space to really make some big 284 00:15:40,236 --> 00:15:41,196 Speaker 2: advancements there overall. 285 00:15:41,516 --> 00:15:44,596 Speaker 1: It's part of the trade-off, like, I could 286 00:15:44,596 --> 00:15:47,436 Speaker 1: build, whatever, two, what do you call them, prongs?
Like, 287 00:15:47,476 --> 00:15:50,516 Speaker 1: if you had two fingers, basically like a claw. Like, 288 00:15:50,556 --> 00:15:52,236 Speaker 1: you could do a lot of things with a claw. 289 00:15:52,276 --> 00:15:56,076 Speaker 1: Presumably it would be way easier, but you couldn't do everything. 290 00:15:56,156 --> 00:15:58,756 Speaker 1: Is it kind of like, what are you optimizing for? 291 00:15:58,836 --> 00:16:00,716 Speaker 1: And sort of how much payoff now versus how much 292 00:16:00,756 --> 00:16:01,356 Speaker 1: payoff later? 293 00:16:01,836 --> 00:16:05,916 Speaker 2: Yeah, I think that's exactly right. It's, you know, versatility, 294 00:16:06,836 --> 00:16:11,556 Speaker 2: you know, compared to robustness and cost. Basically, how much 295 00:16:11,596 --> 00:16:13,596 Speaker 2: complexity do you want to have on the system? And, 296 00:16:14,076 --> 00:16:16,676 Speaker 2: you know, for these robots to be really viable in 297 00:16:16,676 --> 00:16:19,196 Speaker 2: the long run, especially in the industrial space, they've got 298 00:16:19,236 --> 00:16:22,476 Speaker 2: to be able to operate two shifts a day minimum. 299 00:16:22,956 --> 00:16:25,316 Speaker 2: Really, you know, twenty-two hours a day, seven days 300 00:16:25,356 --> 00:16:27,996 Speaker 2: a week. But solving that problem in a hand, so 301 00:16:28,116 --> 00:16:30,756 Speaker 2: just getting the performance of the hand first, but then 302 00:16:30,796 --> 00:16:33,796 Speaker 2: the robustness for them to do that type of work, 303 00:16:34,156 --> 00:16:35,996 Speaker 2: is the next piece. And that's a trade-off of 304 00:16:36,076 --> 00:16:38,116 Speaker 2: performance and complexity and cost. 305 00:16:39,116 --> 00:16:42,436 Speaker 1: Because, like, it gets delicate, right? Presumably the fingers, so 306 00:16:42,436 --> 00:16:46,796 Speaker 1: to speak, would be fragile, right? They'd be 307 00:16:46,836 --> 00:16:47,876 Speaker 1: easy to break. 308 00:16:47,836 --> 00:16:50,356 Speaker 2: Yeah, yeah, yeah, and you've got to maintain it and 309 00:16:50,476 --> 00:16:53,436 Speaker 2: you've got to support those systems and fix them out 310 00:16:53,476 --> 00:16:55,996 Speaker 2: in the field. And so what's the trade-off there? 311 00:16:56,036 --> 00:16:58,396 Speaker 2: And that's a whole trade space that we've been working 312 00:16:58,436 --> 00:16:59,316 Speaker 2: on over a long time. 313 00:17:00,476 --> 00:17:03,556 Speaker 1: So we've been talking about hardware. Let's talk about the 314 00:17:03,596 --> 00:17:06,676 Speaker 1: software side. What's happening with that? 315 00:17:08,076 --> 00:17:10,796 Speaker 2: A lot's happening on that side. I think we're really 316 00:17:10,796 --> 00:17:13,996 Speaker 2: at, you know, an exciting point for 317 00:17:14,116 --> 00:17:17,676 Speaker 2: robotics overall. Think of the AI as really the last 318 00:17:17,676 --> 00:17:19,996 Speaker 2: piece of the puzzle. So, you know, we've had the 319 00:17:20,036 --> 00:17:24,596 Speaker 2: ability to build complex robots for a relatively long time.
320 00:17:25,116 --> 00:17:27,836 Speaker 2: We're just now, you know, really figuring out how to 321 00:17:28,316 --> 00:17:31,836 Speaker 2: take the lessons from automotive and consumer electronics and 322 00:17:31,956 --> 00:17:35,356 Speaker 2: build much more economic systems, and we've had some advancements 323 00:17:35,356 --> 00:17:39,036 Speaker 2: in things like motors and batteries and compute and sensors 324 00:17:39,076 --> 00:17:41,196 Speaker 2: that have all sort of built up to this moment. 325 00:17:42,036 --> 00:17:44,516 Speaker 2: But the final piece of the puzzle was the AI 326 00:17:44,676 --> 00:17:48,356 Speaker 2: and the intelligence, and essentially the way to think about it... 327 00:17:48,396 --> 00:17:50,716 Speaker 2: And I think Jensen does a great job of explaining this, but the advances... 328 00:17:50,876 --> 00:17:55,436 Speaker 1: Jensen Huang, from Nvidia. Yeah, that's right. 329 00:17:55,756 --> 00:17:58,116 Speaker 1: I feel like Jensen is not quite the Elon level 330 00:17:58,196 --> 00:18:01,796 Speaker 1: of one-name household name, but sorry, go on, he's 331 00:18:01,836 --> 00:18:02,356 Speaker 1: getting there. 332 00:18:02,556 --> 00:18:03,316 Speaker 2: Yeah, he should be. 333 00:18:04,156 --> 00:18:05,236 Speaker 1: He should be. He should be. 334 00:18:07,316 --> 00:18:10,996 Speaker 2: Basically, the advancements in gen AI turn out to apply 335 00:18:11,236 --> 00:18:15,756 Speaker 2: very well to robotics, and particularly to humanoid robotics. So 336 00:18:15,796 --> 00:18:19,596 Speaker 2: you can basically map human movement and trajectories from humans 337 00:18:19,636 --> 00:18:22,956 Speaker 2: doing things and build big data sets and use that 338 00:18:23,076 --> 00:18:26,596 Speaker 2: to train robots to do similar tasks in similar environments. 339 00:18:26,916 --> 00:18:31,076 Speaker 2: And these transformer architectures that we're using in generative AI 340 00:18:31,236 --> 00:18:34,076 Speaker 2: actually apply very well to robotics, and so this has 341 00:18:34,116 --> 00:18:37,516 Speaker 2: been a big sort of breakthrough moment for robotics. And 342 00:18:37,556 --> 00:18:41,156 Speaker 2: so I think, as an industry as a whole, everybody's 343 00:18:41,196 --> 00:18:44,596 Speaker 2: really excited right now, because we're reaching new heights, and 344 00:18:44,636 --> 00:18:47,596 Speaker 2: things that we dreamed about doing 345 00:18:47,636 --> 00:18:50,036 Speaker 2: with robots, you know, even a few years ago, 346 00:18:50,156 --> 00:18:53,276 Speaker 2: are now possible, and we're seeing a really rapid advancement 347 00:18:53,316 --> 00:18:54,476 Speaker 2: in performance overall. 348 00:18:54,916 --> 00:18:57,236 Speaker 1: What's an example of a thing that you could only 349 00:18:57,316 --> 00:18:59,996 Speaker 1: dream of a few years ago that robots can do now? 350 00:19:00,556 --> 00:19:03,436 Speaker 2: I think it's more dexterity and versatility. So just the 351 00:19:03,596 --> 00:19:06,476 Speaker 2: range of things that you can do. So the challenge 352 00:19:06,476 --> 00:19:10,236 Speaker 2: for robotics was that each new application, even if you build 353 00:19:10,236 --> 00:19:12,836 Speaker 2: something like a humanoid robot, and this is true for us, 354 00:19:13,156 --> 00:19:18,116 Speaker 2: say you build an application to pick boxes off 355 00:19:18,156 --> 00:19:21,716 Speaker 2: of a pallet and place those boxes onto a conveyor.
Well, 356 00:19:21,716 --> 00:19:24,956 Speaker 2: you hand-build that application, and, you know, maybe it takes 357 00:19:24,996 --> 00:19:27,676 Speaker 2: you eighteen months to sort of wring that out and 358 00:19:27,796 --> 00:19:30,476 Speaker 2: get it to a certain amount of robustness. Well, now 359 00:19:30,516 --> 00:19:33,476 Speaker 2: you want to do the reverse of that and pick 360 00:19:33,516 --> 00:19:36,916 Speaker 2: off of a conveyor and palletize something: that will take 361 00:19:36,916 --> 00:19:39,636 Speaker 2: you the same amount of time that it took you 362 00:19:39,716 --> 00:19:41,636 Speaker 2: to build the initial application. 363 00:19:41,516 --> 00:19:43,636 Speaker 1: You basically have to write a whole 364 00:19:43,676 --> 00:19:46,436 Speaker 1: other piece of software. You have to start from scratch, 365 00:19:46,436 --> 00:19:47,276 Speaker 1: almost, yeah. 366 00:19:47,156 --> 00:19:51,716 Speaker 2: Yeah, exactly. And so basically what is happening now is 367 00:19:51,716 --> 00:19:55,196 Speaker 2: that we now have these much more sort of general 368 00:19:55,556 --> 00:19:58,396 Speaker 2: models where you can collect a lot of data 369 00:19:58,436 --> 00:20:01,356 Speaker 2: at the top layer, and so each new task that 370 00:20:01,436 --> 00:20:04,876 Speaker 2: you want to perform actually takes less and less incremental 371 00:20:04,876 --> 00:20:08,076 Speaker 2: work. So what it's opening up now is 372 00:20:08,196 --> 00:20:11,636 Speaker 2: more dexterous applications. 373 00:20:13,316 --> 00:20:16,436 Speaker 1: Still to come on the show: how Jeff's grandparents inspired 374 00:20:16,476 --> 00:20:27,916 Speaker 1: his work on robots. So you were talking about using 375 00:20:27,956 --> 00:20:32,836 Speaker 1: the transformer model, that has been the, you know, breakthrough 376 00:20:32,876 --> 00:20:37,556 Speaker 1: that has driven large language models, in training robots, essentially. 377 00:20:38,236 --> 00:20:41,836 Speaker 1: I mean, of course, a key sort of serendipitous thing 378 00:20:41,836 --> 00:20:46,156 Speaker 1: that happened with language models was there is this crazy 379 00:20:46,276 --> 00:20:49,596 Speaker 1: large data set of words and pictures, which is the Internet, 380 00:20:49,956 --> 00:20:54,236 Speaker 1: and there's not an analogous data set for the physical world, right? Yeah, 381 00:20:54,756 --> 00:20:58,676 Speaker 1: it seems like that is... is that the rate-limiting step? 382 00:20:58,796 --> 00:21:02,916 Speaker 1: Is that the big problem in sort of AI for robots? 383 00:21:03,236 --> 00:21:05,396 Speaker 2: Yeah, I mean, there's a lot of work that's still 384 00:21:05,436 --> 00:21:08,396 Speaker 2: happening at the research level for, you know, how can 385 00:21:08,436 --> 00:21:11,876 Speaker 2: you pull that kind of data from videos, so 386 00:21:11,916 --> 00:21:14,236 Speaker 2: you can think of big data sets of humans doing 387 00:21:14,276 --> 00:21:17,556 Speaker 2: things that could be really interesting to train robots in 388 00:21:17,596 --> 00:21:20,836 Speaker 2: the future, and that will come into play 389 00:21:20,836 --> 00:21:23,636 Speaker 2: over time. But yeah, it's the chicken-or-the-egg problem. 390 00:21:23,796 --> 00:21:27,516 Speaker 2: And data is one of the key things that 391 00:21:27,556 --> 00:21:31,076 Speaker 2: we need to enable the next wave of breakthroughs.
And 392 00:21:31,076 --> 00:21:33,316 Speaker 2: this is kind of the race: can you 393 00:21:33,316 --> 00:21:36,196 Speaker 2: get robots out into the real world, into 394 00:21:36,236 --> 00:21:39,796 Speaker 2: the field, collecting data, you know, in 395 00:21:39,916 --> 00:21:43,236 Speaker 2: very high volumes? Whoever does that, you know, will 396 00:21:43,236 --> 00:21:46,276 Speaker 2: have better models. And this is the data flywheel. So 397 00:21:46,636 --> 00:21:48,436 Speaker 2: this is kind of the race that's on right now, 398 00:21:48,436 --> 00:21:51,356 Speaker 2: where you hear a lot of other humanoid CEOs talking 399 00:21:51,356 --> 00:21:53,516 Speaker 2: about getting a lot of robots out into the world. 400 00:21:54,116 --> 00:21:57,196 Speaker 2: Largely those are going to be under teleoperation, collecting 401 00:21:57,276 --> 00:22:00,076 Speaker 2: data and then, you know, training and building these models 402 00:22:00,116 --> 00:22:00,676 Speaker 2: of the future. 403 00:22:01,516 --> 00:22:05,916 Speaker 1: Ah. So it's like whoever gets there first will win, 404 00:22:06,076 --> 00:22:08,556 Speaker 1: just because that'll be the accelerant. Like, once you have 405 00:22:08,716 --> 00:22:11,556 Speaker 1: robots in the world and you're collecting data, then you're 406 00:22:11,716 --> 00:22:14,516 Speaker 1: immediately getting ahead of whoever has fewer robots out in 407 00:22:14,516 --> 00:22:16,276 Speaker 1: the world, because they're collecting less data. 408 00:22:16,356 --> 00:22:17,316 Speaker 2: Yeah, so. 409 00:22:18,796 --> 00:22:20,036 Speaker 1: Tell me about teleoperation. 410 00:22:21,796 --> 00:22:25,836 Speaker 2: Teleoperation is basically just remotely controlling the robot, so you're 411 00:22:25,876 --> 00:22:28,516 Speaker 2: taking over the robot. You can see through the robot's 412 00:22:28,556 --> 00:22:32,476 Speaker 2: eyes with a VR headset, and then you're controlling the 413 00:22:32,556 --> 00:22:36,476 Speaker 2: robot's arms and hands to do a particular task. It's 414 00:22:36,476 --> 00:22:39,036 Speaker 2: like a video game and you're controlling a robot. That's 415 00:22:39,076 --> 00:22:43,876 Speaker 2: the simple idea. There's a couple of reasons it's important. 416 00:22:44,276 --> 00:22:46,116 Speaker 2: The first thing is that it tells you what the 417 00:22:46,196 --> 00:22:49,876 Speaker 2: robot's physically capable of doing. So if I'm completely controlling 418 00:22:49,876 --> 00:22:52,836 Speaker 2: the robot and I can't do a task under teleoperation, 419 00:22:53,276 --> 00:22:55,836 Speaker 2: then that means the robot's not physically capable of doing 420 00:22:55,876 --> 00:22:58,996 Speaker 2: it, and it would be very difficult for an AI control system 421 00:22:59,036 --> 00:23:01,196 Speaker 2: to control the robot to do that. So this is 422 00:23:01,236 --> 00:23:05,516 Speaker 2: how we understand the physical capabilities of the robot. As 423 00:23:05,516 --> 00:23:08,916 Speaker 2: these new models have come along, the simple idea is 424 00:23:08,956 --> 00:23:10,996 Speaker 2: that if you can teleoperate the robot to 425 00:23:11,036 --> 00:23:13,916 Speaker 2: do a task, then you should be able to automate 426 00:23:13,956 --> 00:23:16,116 Speaker 2: that task on the other end.
So if you can 427 00:23:16,116 --> 00:23:20,476 Speaker 2: collect enough data under teleoperation, then you can automate 428 00:23:20,476 --> 00:23:23,716 Speaker 2: it by running it through these similar architectures that we 429 00:23:23,796 --> 00:23:24,316 Speaker 2: talked about. 430 00:23:25,196 --> 00:23:29,236 Speaker 1: And so is the basic idea that, like, you use remote 431 00:23:29,236 --> 00:23:31,676 Speaker 1: control to, like, drive the robot to do a thing, 432 00:23:32,036 --> 00:23:35,756 Speaker 1: whatever, a thousand times, some number of times, and in 433 00:23:35,836 --> 00:23:39,396 Speaker 1: doing that, you're training the robot, you're training the software, 434 00:23:39,436 --> 00:23:39,956 Speaker 1: training the AI? 435 00:23:40,476 --> 00:23:41,596 Speaker 2: Yeah, that's exactly right. 436 00:23:42,156 --> 00:23:44,756 Speaker 1: What's an example of a thing that you've done that way? 437 00:23:44,796 --> 00:23:46,276 Speaker 1: And, like, you know, how many times did you have 438 00:23:46,316 --> 00:23:48,036 Speaker 1: to remote control it before the robot could do it? 439 00:23:48,356 --> 00:23:50,996 Speaker 2: So picking is a good example, where, you 440 00:23:51,036 --> 00:23:53,276 Speaker 2: know, you're taking objects and you're putting them into a 441 00:23:53,356 --> 00:23:57,556 Speaker 2: box. To do that in a simple context, thousands of 442 00:23:59,116 --> 00:24:02,916 Speaker 2: demonstrations is what you need, you know, and we think 443 00:24:02,956 --> 00:24:07,276 Speaker 2: of this generally as hours. So, you know, how many 444 00:24:07,276 --> 00:24:11,596 Speaker 2: hours of data collection have we done? And thousands of 445 00:24:11,636 --> 00:24:14,796 Speaker 2: iterations can get you to, let's say, eighty percent of 446 00:24:14,876 --> 00:24:17,356 Speaker 2: human rate. If you want to get to ninety-five 447 00:24:17,396 --> 00:24:20,356 Speaker 2: percent or better of human rate, then you need more 448 00:24:20,356 --> 00:24:23,716 Speaker 2: and more data. But it's in the thousands, it's not millions. 449 00:24:24,156 --> 00:24:27,036 Speaker 1: Yeah, thousands makes it seem totally tractable. 450 00:24:27,316 --> 00:24:30,516 Speaker 2: Yeah, I was actually surprised by how well these models 451 00:24:31,356 --> 00:24:33,916 Speaker 2: work and actually how little data they need to get 452 00:24:33,996 --> 00:24:37,716 Speaker 2: relatively good performance. And you're seeing a lot of demonstrations 453 00:24:38,036 --> 00:24:40,076 Speaker 2: of this out there today. 454 00:24:40,076 --> 00:24:42,956 Speaker 1: And presumably that'll get better and better, right? As the 455 00:24:42,996 --> 00:24:47,156 Speaker 1: software side of AI gets better and better, it'll learn 456 00:24:47,196 --> 00:24:49,556 Speaker 1: faster, essentially. And the other obvious thing, but I'm just 457 00:24:49,636 --> 00:24:51,796 Speaker 1: going to say it, is like, once you have done 458 00:24:51,836 --> 00:24:54,876 Speaker 1: it once, then it works for every robot. Then you 459 00:24:54,916 --> 00:24:56,796 Speaker 1: can make a million robots and they all know how 460 00:24:56,796 --> 00:24:57,676 Speaker 1: to do the thing, right? 461 00:24:57,796 --> 00:24:59,836 Speaker 2: Yeah, that's exactly right. And one of the interesting things 462 00:24:59,876 --> 00:25:02,956 Speaker 2: about these models is actually the diversity of data is 463 00:25:02,996 --> 00:25:06,516 Speaker 2: almost more important than task-specific data.
So you want 464 00:25:06,556 --> 00:25:09,596 Speaker 2: to go wide across a range of tasks, and then 465 00:25:09,836 --> 00:25:13,036 Speaker 2: you're basically building all these skills into the robot, and 466 00:25:13,076 --> 00:25:16,436 Speaker 2: then it becomes better at doing any one particular task. 467 00:25:17,236 --> 00:25:18,836 Speaker 1: It really is like learning. 468 00:25:19,236 --> 00:25:21,956 Speaker 2: It really is human-esque. Yeah, that's right. 469 00:25:22,476 --> 00:25:25,236 Speaker 1: So I know you're in a few pilot projects with 470 00:25:25,876 --> 00:25:30,596 Speaker 1: Mercedes and, what is GXO, a big logistics company. When do 471 00:25:30,636 --> 00:25:32,756 Speaker 1: you want to start selling robots for real? Like, when 472 00:25:32,756 --> 00:25:33,836 Speaker 1: do you think that might happen? 473 00:25:33,956 --> 00:25:34,836 Speaker 2: Twenty twenty six. 474 00:25:35,716 --> 00:25:39,036 Speaker 1: Okay. Yeah, suddenly, that's next year, almost. 475 00:25:39,076 --> 00:25:42,556 Speaker 2: Now, a year is a long time in, you know... 476 00:25:42,916 --> 00:25:46,676 Speaker 2: these are dog years. It's a long time in this space. 477 00:25:48,436 --> 00:25:50,676 Speaker 1: And twenty twenty six could be almost two years. 478 00:25:50,796 --> 00:25:50,996 Speaker 2: Yeah. 479 00:25:51,316 --> 00:25:54,796 Speaker 1: Now, like, who are you going to sell robots to, and 480 00:25:54,836 --> 00:25:56,716 Speaker 1: how much are you going to charge, and what are they 481 00:25:56,756 --> 00:25:56,996 Speaker 1: going to do? 482 00:25:57,036 --> 00:26:02,316 Speaker 2: So, initially in manufacturing and logistics, so folks 483 00:26:02,316 --> 00:26:05,436 Speaker 2: like Mercedes and GXO, these are the initial customers of 484 00:26:05,476 --> 00:26:11,756 Speaker 2: these systems. We are not announcing pricing yet, but you 485 00:26:11,756 --> 00:26:15,276 Speaker 2: can think of it as, you know, take what you 486 00:26:15,276 --> 00:26:17,476 Speaker 2: know it costs to do these tasks today, and it's at 487 00:26:17,556 --> 00:26:21,076 Speaker 2: some discount to what it costs to do these tasks today. 488 00:26:21,916 --> 00:26:24,556 Speaker 2: We have a RaaS model that we use, 489 00:26:24,636 --> 00:26:28,236 Speaker 2: so basically robot as a service. Yeah, a 490 00:26:28,276 --> 00:26:32,796 Speaker 2: robot-as-a-service model where you're paying the robot, basically, 491 00:26:33,036 --> 00:26:36,996 Speaker 2: you know, by the hour, effectively, to do a particular task, 492 00:26:37,036 --> 00:26:38,916 Speaker 2: and that's at a discount to what it costs to 493 00:26:38,956 --> 00:26:39,876 Speaker 2: do that task today. 494 00:26:40,836 --> 00:26:43,116 Speaker 1: How far are you from the fifty-thousand-dollar robot? 495 00:26:43,756 --> 00:26:48,556 Speaker 2: We're not there yet, but not very far. So we 496 00:26:48,636 --> 00:26:50,636 Speaker 2: have the architecture to be able to do this. So 497 00:26:51,116 --> 00:26:52,996 Speaker 2: getting the cost down on these robots is a 498 00:26:53,036 --> 00:26:58,396 Speaker 2: two-step process. So first step is new architectures. So if 499 00:26:58,396 --> 00:27:01,436 Speaker 2: you still require this very high precision in the system 500 00:27:01,476 --> 00:27:05,116 Speaker 2: and you're using bespoke components that are only used for robotics, 501 00:27:05,836 --> 00:27:09,516 Speaker 2: these robots will still be expensive.
The challenge of humanoid 502 00:27:09,596 --> 00:27:12,916 Speaker 2: robots is they have a lot more motors than traditional robots. 503 00:27:13,036 --> 00:27:16,716 Speaker 2: So a traditional robot has six or seven motors; a humanoid 504 00:27:16,796 --> 00:27:19,316 Speaker 2: robot has thirty to forty plus. 505 00:27:19,636 --> 00:27:22,916 Speaker 1: Okay, so that means it's expensive, or you've got to 506 00:27:22,916 --> 00:27:25,316 Speaker 1: figure out how to get cheaper actuators. 507 00:27:25,396 --> 00:27:26,836 Speaker 2: Yeah, so we're there. So for us, that was a 508 00:27:26,836 --> 00:27:30,356 Speaker 2: five-hundred-dollar actuator, and we have a 509 00:27:30,356 --> 00:27:33,236 Speaker 2: five-hundred-dollar actuator now, today. And so once you 510 00:27:33,316 --> 00:27:36,436 Speaker 2: solve that problem, and once you solve the architecture problem, 511 00:27:36,476 --> 00:27:40,036 Speaker 2: now it's about scale and manufacturing. So a lot of 512 00:27:40,036 --> 00:27:41,836 Speaker 2: where we spend, a lot of where the cost is 513 00:27:43,196 --> 00:27:45,796 Speaker 2: driven at low volumes, is in just the structures of 514 00:27:45,836 --> 00:27:49,356 Speaker 2: the robot, where we're milling parts out of 515 00:27:49,396 --> 00:27:54,276 Speaker 2: big blocks of metal in very small quantities. But 516 00:27:54,876 --> 00:27:59,356 Speaker 2: there's other techniques that are much more cost-effective, 517 00:28:00,156 --> 00:28:04,036 Speaker 2: like casting or stamping, and these will allow these robots 518 00:28:04,116 --> 00:28:06,596 Speaker 2: to be much cheaper. As I mentioned, look at automotive, 519 00:28:06,676 --> 00:28:09,596 Speaker 2: and look at the scale of automotive: there's four percent 520 00:28:09,636 --> 00:28:13,236 Speaker 2: of the raw material by weight in a humanoid robot as 521 00:28:13,276 --> 00:28:16,076 Speaker 2: compared to a car. So once you 522 00:28:16,116 --> 00:28:19,076 Speaker 2: solve the architecture problem such that you can build a 523 00:28:19,076 --> 00:28:21,916 Speaker 2: lot of these systems and they're simpler to 524 00:28:22,036 --> 00:28:26,036 Speaker 2: make, the next piece is just applying mass manufacturing 525 00:28:26,676 --> 00:28:29,636 Speaker 2: approaches to this to make them a lot cheaper as 526 00:28:29,676 --> 00:28:31,516 Speaker 2: you scale. 527 00:28:33,156 --> 00:28:36,716 Speaker 1: Well, I mean, that's a hard leap to make, right? Like, what do 528 00:28:36,716 --> 00:28:38,556 Speaker 1: you do? You get a ton of capital and just 529 00:28:39,636 --> 00:28:42,356 Speaker 1: build a factory and hope there's demand on the other end? Like, 530 00:28:42,396 --> 00:28:45,596 Speaker 1: how do you go from this bespoke, expensive thing to 531 00:28:45,716 --> 00:28:48,516 Speaker 1: a mass-produced, you know, much less expensive thing? 532 00:28:48,516 --> 00:28:50,956 Speaker 2: Well, it's a gradient. So, like I said, step one 533 00:28:51,116 --> 00:28:54,316 Speaker 2: is you have new approaches that allow you to make 534 00:28:54,356 --> 00:28:57,476 Speaker 2: them cheaper, just inherently on a unit-to-unit basis. 535 00:28:57,516 --> 00:29:00,996 Speaker 2: So the early humanoids were like millions of dollars, and 536 00:29:01,036 --> 00:29:03,836 Speaker 2: now we're in the hundreds of thousands of dollars range 537 00:29:03,876 --> 00:29:04,756 Speaker 2: for building one.
538 00:29:04,996 --> 00:29:06,876 Speaker 1: So you just got to get one more order of 539 00:29:06,956 --> 00:29:07,916 Speaker 1: magnitude out of it. 540 00:29:08,036 --> 00:29:10,796 Speaker 2: Yes, we've already dropped the price by an order 541 00:29:10,836 --> 00:29:13,676 Speaker 2: of magnitude. And now, as we build more, and 542 00:29:13,756 --> 00:29:15,556 Speaker 2: even as you add a zero, as 543 00:29:15,596 --> 00:29:18,636 Speaker 2: you go from ten to one hundred, the price drops 544 00:29:18,636 --> 00:29:21,876 Speaker 2: pretty dramatically. So you don't need the volume 545 00:29:22,556 --> 00:29:24,556 Speaker 2: that you might think. Like, my view is we can 546 00:29:24,556 --> 00:29:27,316 Speaker 2: get to the sub-fifty-thousand-dollar price point in 547 00:29:27,396 --> 00:29:31,036 Speaker 2: the thousands-of-units quantity, so without hundreds of 548 00:29:31,076 --> 00:29:32,356 Speaker 2: thousands or millions of these. 549 00:29:32,316 --> 00:29:36,236 Speaker 1: So one big buyer, one big car company 550 00:29:36,276 --> 00:29:40,156 Speaker 1: or logistics company might place an order of thousands of units. 551 00:29:39,876 --> 00:29:43,076 Speaker 2: Right, yeah. And you made a comment where you said 552 00:29:43,196 --> 00:29:45,636 Speaker 2: you hope that there's demand. One of the things 553 00:29:45,636 --> 00:29:47,756 Speaker 2: that I think is important to note is the demand 554 00:29:47,916 --> 00:29:53,476 Speaker 2: for these robots is enormous. We have demand for hundreds 555 00:29:53,516 --> 00:29:56,916 Speaker 2: of thousands of units already today with the customers that 556 00:29:56,956 --> 00:30:00,996 Speaker 2: we're working with. So the demand is enormous. So 557 00:30:01,276 --> 00:30:03,436 Speaker 2: we're ramping up. You know, we've got to get the 558 00:30:03,556 --> 00:30:06,516 Speaker 2: robustness and the safety of the system and really bring 559 00:30:06,556 --> 00:30:09,716 Speaker 2: out the design. And, you know, these are, you know, 560 00:30:10,236 --> 00:30:13,956 Speaker 2: really credible, thoughtful people that are coming from other industries 561 00:30:13,996 --> 00:30:16,716 Speaker 2: that are now joining us, that now see that we've 562 00:30:16,796 --> 00:30:23,436 Speaker 2: crossed this threshold of technical viability, and are now taking lessons 563 00:30:23,436 --> 00:30:26,276 Speaker 2: from, you know, how you scale and manufacture other things 564 00:30:26,276 --> 00:30:29,276 Speaker 2: and bringing that into the robotics space and the humanoid 565 00:30:29,316 --> 00:30:30,076 Speaker 2: space overall. 566 00:30:31,436 --> 00:30:33,756 Speaker 1: So in a year, or at least next year, you 567 00:30:33,796 --> 00:30:37,196 Speaker 1: want to be selling robots for real? Where do you 568 00:30:37,196 --> 00:30:39,916 Speaker 1: want to be in, say, five years? 569 00:30:41,116 --> 00:30:43,316 Speaker 2: My view is that where this evolves is it's going 570 00:30:43,396 --> 00:30:45,876 Speaker 2: to start in logistics and manufacturing, and then, as we 571 00:30:45,916 --> 00:30:49,676 Speaker 2: solve safety as an industry... I'm really interested in healthcare, 572 00:30:49,876 --> 00:30:53,756 Speaker 2: and particularly in elder care, over time. So, you know, 573 00:30:54,156 --> 00:30:57,436 Speaker 2: how can these robots be used to improve the way 574 00:30:57,436 --> 00:31:00,116 Speaker 2: we live and work?
That was really the lens that 575 00:31:00,156 --> 00:31:04,036 Speaker 2: I came into this on, and so I think over 576 00:31:04,076 --> 00:31:06,316 Speaker 2: the next five years you'll start to see the early 577 00:31:06,396 --> 00:31:10,036 Speaker 2: stages of it; over the next three years you'll just start to see early 578 00:31:10,156 --> 00:31:14,476 Speaker 2: applications for robots entering the home. There's some folks that 579 00:31:14,516 --> 00:31:17,196 Speaker 2: are really working hard on this. I think we're about 580 00:31:17,236 --> 00:31:21,036 Speaker 2: three years out from that being really viable, but I 581 00:31:21,076 --> 00:31:23,716 Speaker 2: hope people prove me wrong. I hope it's faster than that. 582 00:31:24,596 --> 00:31:29,516 Speaker 1: Three years seems fast. What's the sort of first 583 00:31:30,236 --> 00:31:32,876 Speaker 1: use case, the first job you imagine a robot doing for 584 00:31:32,956 --> 00:31:35,836 Speaker 1: real in somebody's house in three years? 585 00:31:35,596 --> 00:31:39,036 Speaker 2: Well, everybody wants laundry. Everybody I talk to says, 586 00:31:39,516 --> 00:31:41,236 Speaker 2: when is this thing going to do my laundry? And 587 00:31:41,276 --> 00:31:42,276 Speaker 2: I want that as well. 588 00:31:42,516 --> 00:31:45,556 Speaker 1: There's literally already a machine to do your laundry. All 589 00:31:45,556 --> 00:31:47,436 Speaker 1: you have to do is put it in one machine 590 00:31:47,436 --> 00:31:50,076 Speaker 1: and then put it in another. The remaining work is trivial. 591 00:31:50,316 --> 00:31:53,996 Speaker 2: Yeah, I mean, look, I'm not the person to talk 592 00:31:53,996 --> 00:31:56,076 Speaker 2: about the home. I think we're still a ways out. 593 00:31:56,156 --> 00:31:59,356 Speaker 2: But there's humanoid companies like 1X that are really 594 00:31:59,396 --> 00:32:02,396 Speaker 2: focused on the home, and we've got a lot 595 00:32:02,436 --> 00:32:04,356 Speaker 2: of respect for what they're doing over there, and so 596 00:32:05,596 --> 00:32:08,196 Speaker 2: I hope they do it. I know that they're working 597 00:32:08,236 --> 00:32:12,676 Speaker 2: hard on it. And, you know, I want a 598 00:32:12,756 --> 00:32:15,276 Speaker 2: robot for the home as well. And a lot 599 00:32:15,316 --> 00:32:17,436 Speaker 2: of the things that are happening with 600 00:32:17,476 --> 00:32:20,076 Speaker 2: these models that I talked about, these more generic models, 601 00:32:20,316 --> 00:32:22,756 Speaker 2: the things that we're learning in the industrial base, can 602 00:32:22,796 --> 00:32:25,716 Speaker 2: apply to the home over time as well. 603 00:32:25,956 --> 00:32:27,676 Speaker 1: In terms of the AI models, sure. I mean, the 604 00:32:27,716 --> 00:32:30,196 Speaker 1: AI models are basically teaching a robot how 605 00:32:30,236 --> 00:32:32,196 Speaker 1: to deal with the physical world, that's right, how to 606 00:32:32,236 --> 00:32:34,716 Speaker 1: move around, how to pick things up, how to put 607 00:32:34,756 --> 00:32:35,276 Speaker 1: things down. 608 00:32:35,556 --> 00:32:37,316 Speaker 2: Yeah, the tasks and the... I mean, the home's tough 609 00:32:37,396 --> 00:32:39,556 Speaker 2: because, like, how much is it worth? Even if a robot does 610 00:32:39,596 --> 00:32:43,236 Speaker 2: your dishes, your laundry, cleans and cooks for you, 611 00:32:44,316 --> 00:32:46,716 Speaker 2: how much are you willing to pay for that on 612 00:32:46,756 --> 00:32:47,716 Speaker 2: a yearly basis?
613 00:32:47,956 --> 00:32:50,996 Speaker 1: Imagining the first household tasks, I would have thought 614 00:32:50,996 --> 00:32:54,516 Speaker 1: you would have said, like, people who are quadriplegic, right? Like, 615 00:32:54,556 --> 00:32:56,796 Speaker 1: there are a lot of people who have various kinds 616 00:32:56,796 --> 00:32:59,556 Speaker 1: of mobility problems who can't do very basic things around 617 00:32:59,556 --> 00:33:01,836 Speaker 1: the house, where essentially a robot could do it for them. 618 00:33:01,836 --> 00:33:03,716 Speaker 1: I would think that would be the first use case. 619 00:33:03,796 --> 00:33:06,036 Speaker 2: I think that's a great use case. And you know, 620 00:33:06,036 --> 00:33:07,876 Speaker 2: for me, that's sort of in the realm of 621 00:33:07,876 --> 00:33:10,196 Speaker 2: what I call elder care, which is, like, assistive 622 00:33:10,356 --> 00:33:13,676 Speaker 2: robots that help you with just basic tasks, right? Like, 623 00:33:13,796 --> 00:33:16,236 Speaker 2: you know, my granddads: one granddad went to a home, 624 00:33:16,316 --> 00:33:19,116 Speaker 2: the other granddad had in-home care. And for the one 625 00:33:19,116 --> 00:33:21,836 Speaker 2: that had in-home care, it was very simple things: 626 00:33:21,876 --> 00:33:24,876 Speaker 2: remind you to take your medication and bring the medication over, 627 00:33:25,276 --> 00:33:27,916 Speaker 2: get you a glass of water, help you to get 628 00:33:27,996 --> 00:33:29,676 Speaker 2: up and out of bed, you know, to go to 629 00:33:29,716 --> 00:33:32,796 Speaker 2: the bathroom. Just help you stabilize to go to the bathroom. 630 00:33:34,116 --> 00:33:37,116 Speaker 2: And so that's not something that we're doing yet; we're 631 00:33:37,156 --> 00:33:41,916 Speaker 2: largely paying attention to industrial applications right now. But that 632 00:33:42,076 --> 00:33:44,876 Speaker 2: is the dream long term. So I'll be excited to 633 00:33:44,876 --> 00:33:45,756 Speaker 2: see how it shakes out. 634 00:33:46,396 --> 00:33:49,276 Speaker 1: Yeah, rationally, what you were saying makes sense. 635 00:33:49,356 --> 00:33:51,516 Speaker 1: Like, I understand most people would rather stay at home. 636 00:33:51,556 --> 00:33:55,796 Speaker 1: I understand that in-home care is, like, impossibly expensive 637 00:33:55,836 --> 00:33:59,196 Speaker 1: for most people. At the same time, like, my emotional 638 00:33:59,236 --> 00:34:02,436 Speaker 1: response to a robot taking care of, say, my parents 639 00:34:02,636 --> 00:34:05,756 Speaker 1: is it makes me feel sad. And I recognize that 640 00:34:05,756 --> 00:34:08,116 Speaker 1: that's perhaps irrational, but that is at some level my 641 00:34:08,156 --> 00:34:10,756 Speaker 1: emotional response. But you know, the happy thing is, like, 642 00:34:10,876 --> 00:34:13,876 Speaker 1: I should take care of them, but like, that's hard 643 00:34:13,916 --> 00:34:15,596 Speaker 1: and it's probably not gonna happen, for its 644 00:34:15,596 --> 00:34:18,196 Speaker 1: own set of reasons. Right, it's more than we 645 00:34:18,236 --> 00:34:20,796 Speaker 1: bargained for in this conversation, I don't know. But there 646 00:34:20,876 --> 00:34:23,396 Speaker 1: is something, though; a humanoid robot starts to get to 647 00:34:23,436 --> 00:34:25,556 Speaker 1: some weird places in that way.
648 00:34:25,476 --> 00:34:28,716 Speaker 2: Right. Yeah, I've thought a lot about this, and 649 00:34:28,756 --> 00:34:31,076 Speaker 2: I think it's a great place to go; happy 650 00:34:31,076 --> 00:34:33,436 Speaker 2: to talk about it. I think what we want is 651 00:34:33,436 --> 00:34:36,396 Speaker 2: we want humans taking care of other humans. That's what 652 00:34:36,436 --> 00:34:39,676 Speaker 2: we want, right? But we don't have that today. 653 00:34:39,876 --> 00:34:42,196 Speaker 2: You know, look at the way that we age. You know, 654 00:34:42,516 --> 00:34:44,756 Speaker 2: for me, you know, I was very close to both 655 00:34:44,756 --> 00:34:46,956 Speaker 2: of my granddads. They both lived into their nineties and 656 00:34:46,996 --> 00:34:52,836 Speaker 2: outlived my grandmothers, oddly enough, and so I sort of 657 00:34:52,876 --> 00:34:55,596 Speaker 2: watched them age through their lens, and that was a 658 00:34:55,596 --> 00:34:58,916 Speaker 2: big driver of doing this. And you know, these are 659 00:34:58,996 --> 00:35:02,956 Speaker 2: people, both of them were war heroes, they contributed to society, 660 00:35:03,036 --> 00:35:05,796 Speaker 2: they did all these amazing things. And then at the 661 00:35:05,916 --> 00:35:07,636 Speaker 2: end of their life, they felt like they were a 662 00:35:07,636 --> 00:35:11,796 Speaker 2: burden to their family. And they had this feeling like 663 00:35:11,836 --> 00:35:14,236 Speaker 2: they had never had to rely on anyone for anything, and 664 00:35:14,276 --> 00:35:20,236 Speaker 2: now they were completely reliant on people for everything. And what 665 00:35:20,316 --> 00:35:23,236 Speaker 2: I saw them do as they aged was they lost 666 00:35:23,236 --> 00:35:26,676 Speaker 2: their dignity. And for me, this idea that you could 667 00:35:26,716 --> 00:35:31,196 Speaker 2: have a machine that carries your secrets, that is your 668 00:35:31,276 --> 00:35:35,396 Speaker 2: machine, that does things for you, allows you to keep 669 00:35:35,436 --> 00:35:39,716 Speaker 2: your dignity, such that then you, as a human that's aging, 670 00:35:40,076 --> 00:35:42,476 Speaker 2: you're fresher. You don't have to rely on your son 671 00:35:42,596 --> 00:35:46,236 Speaker 2: or your daughter or your spouse to get you a 672 00:35:46,236 --> 00:35:49,316 Speaker 2: glass of water or to do things for you. You 673 00:35:49,396 --> 00:35:52,236 Speaker 2: still have your own agency and your own autonomy through 674 00:35:52,236 --> 00:35:55,436 Speaker 2: a machine, and that then helps your family to be 675 00:35:55,556 --> 00:35:58,236 Speaker 2: much fresher, because they don't have the burden of having 676 00:35:58,276 --> 00:36:01,276 Speaker 2: to do all these things to support you. 677 00:36:01,316 --> 00:36:03,556 Speaker 2: And so my hope is that 678 00:36:04,396 --> 00:36:07,076 Speaker 2: this is not designed to replace what humans do for 679 00:36:07,116 --> 00:36:10,996 Speaker 2: each other. This is designed to augment and enhance that.
I remember, 680 00:36:11,116 --> 00:36:15,116 Speaker 2: just like with my granddad as he was getting older, you know, 681 00:36:15,636 --> 00:36:17,916 Speaker 2: I was working and I was busy, and you know, 682 00:36:18,076 --> 00:36:19,916 Speaker 2: I would try to go over as many days as 683 00:36:19,956 --> 00:36:21,956 Speaker 2: I could, but it was always really tough, and he 684 00:36:21,956 --> 00:36:23,956 Speaker 2: didn't want to be alone, and it was this whole 685 00:36:24,356 --> 00:36:27,796 Speaker 2: battle that I think everybody goes through. And my hope 686 00:36:27,796 --> 00:36:30,316 Speaker 2: for the future, and I actually think it's a much more optimistic 687 00:36:30,476 --> 00:36:33,516 Speaker 2: version, is that hopefully my parents have a robot, and 688 00:36:33,556 --> 00:36:38,356 Speaker 2: that robot is basically programmed for their happiness, and it's 689 00:36:38,396 --> 00:36:41,396 Speaker 2: designed to remind them, when they're down, of their favorite 690 00:36:41,436 --> 00:36:45,156 Speaker 2: song and play it, right? Remind them of the 691 00:36:45,636 --> 00:36:48,356 Speaker 2: movies that they like watching, or whatever it might be, right? 692 00:36:48,476 --> 00:36:52,556 Speaker 2: And I think that's more optimistic. I think that's exciting, 693 00:36:52,596 --> 00:36:55,596 Speaker 2: and that makes me hopeful about the future. And you know, 694 00:36:55,596 --> 00:36:57,836 Speaker 2: I think the worst part of the human experience 695 00:36:57,916 --> 00:36:59,476 Speaker 2: is the way that we age. And I think that 696 00:36:59,516 --> 00:37:03,316 Speaker 2: these robots, and embodied AI and AI in general, 697 00:37:04,316 --> 00:37:06,516 Speaker 2: can hopefully allow us to take better care of each other. 698 00:37:06,796 --> 00:37:11,676 Speaker 2: So I don't think it is creepy. I think it 699 00:37:11,716 --> 00:37:15,516 Speaker 2: can actually be pretty beautiful if properly done. And that's 700 00:37:15,516 --> 00:37:19,036 Speaker 2: what you asked at the beginning: how is Eptronic different 701 00:37:19,436 --> 00:37:22,236 Speaker 2: and what are we focused on? You know, for me, 702 00:37:22,676 --> 00:37:26,036 Speaker 2: I say human-centered robotics. But what that means is 703 00:37:26,076 --> 00:37:30,436 Speaker 2: that we want this to be an optimistic future for humanity. 704 00:37:30,996 --> 00:37:33,636 Speaker 2: We are tool makers, and we want to build tools 705 00:37:33,676 --> 00:37:38,276 Speaker 2: for humans to enable us to live in better ways. 706 00:37:38,916 --> 00:37:41,796 Speaker 2: And I think that if we really focus on that, 707 00:37:41,836 --> 00:37:44,156 Speaker 2: there can be really amazing ways of 708 00:37:44,236 --> 00:37:47,276 Speaker 2: doing this, and I think elder care is a great 709 00:37:47,316 --> 00:37:49,356 Speaker 2: example of how this can be used in that way. 710 00:37:53,956 --> 00:37:56,076 Speaker 1: We'll be back in a minute with the Lightning Round. 711 00:38:05,516 --> 00:38:08,796 Speaker 1: And now we are back, as promised, with the Lightning Round. 712 00:38:09,636 --> 00:38:13,356 Speaker 1: What's the biggest difference between Austin today and Austin ten 713 00:38:13,436 --> 00:38:13,956 Speaker 1: years ago? 714 00:38:14,996 --> 00:38:18,796 Speaker 2: Oh man, it's changed quite a bit. So
one of 715 00:38:18,836 --> 00:38:22,796 Speaker 2: the things that made Austin, you know, really a great 716 00:38:22,836 --> 00:38:25,436 Speaker 2: place to live and work was just how small it 717 00:38:25,516 --> 00:38:28,316 Speaker 2: was and how accessible everybody was. You know, we used 718 00:38:28,316 --> 00:38:31,636 Speaker 2: to have these house parties, and somebody would bring a violin, 719 00:38:31,876 --> 00:38:35,756 Speaker 2: and someone would bring a sitar, and these world instruments, 720 00:38:35,796 --> 00:38:38,796 Speaker 2: and you'd have just all sorts of eclectic, creative people 721 00:38:38,876 --> 00:38:42,876 Speaker 2: doing really interesting things. And so I think, you know, 722 00:38:42,956 --> 00:38:46,516 Speaker 2: one of the things that I am worried about is, 723 00:38:47,476 --> 00:38:50,396 Speaker 2: that was kind of what made Austin special, and the 724 00:38:50,436 --> 00:38:54,476 Speaker 2: things that make you special, people want to kind of commercialize, right? 725 00:38:54,516 --> 00:38:56,036 Speaker 2: They want to take this and 726 00:38:56,076 --> 00:38:59,076 Speaker 2: they want to sort of scale it, and it's 727 00:38:59,076 --> 00:39:02,276 Speaker 2: almost special because it's not commercialized. It's just this raw, 728 00:39:02,476 --> 00:39:05,396 Speaker 2: organic thing. And so, as more 729 00:39:05,476 --> 00:39:08,236 Speaker 2: tech and more money comes into Austin, you know, how 730 00:39:08,236 --> 00:39:12,236 Speaker 2: does what made Austin great 731 00:39:12,876 --> 00:39:17,396 Speaker 2: continue to evolve? That said, 732 00:39:17,436 --> 00:39:19,476 Speaker 2: I welcome it. You know, I'd rather be in the 733 00:39:19,516 --> 00:39:23,436 Speaker 2: place where everybody's coming and everyone wants to build the future. 734 00:39:23,516 --> 00:39:26,836 Speaker 2: So I'm not one of those that is sort of 735 00:39:26,876 --> 00:39:29,516 Speaker 2: resisting the changes. I think it's really exciting, and I 736 00:39:29,516 --> 00:39:33,036 Speaker 2: think more people with new ideas about the future and 737 00:39:33,076 --> 00:39:35,876 Speaker 2: the world, and kind of a free place to do it. 738 00:39:36,356 --> 00:39:38,956 Speaker 2: There's, you know, a real ability here in 739 00:39:38,996 --> 00:39:42,996 Speaker 2: Texas and in Austin to kind of do what you want, 740 00:39:43,076 --> 00:39:46,516 Speaker 2: and there's a real culture around, you know, the freedom 741 00:39:46,596 --> 00:39:48,876 Speaker 2: to do the things that you want to do. And 742 00:39:48,996 --> 00:39:51,916 Speaker 2: so it's kind of a unique place where all 743 00:39:51,916 --> 00:39:54,956 Speaker 2: of that is coming together, the creativity and, you know, the 744 00:39:55,796 --> 00:40:00,196 Speaker 2: capitalism. Is Austin still weird? 745 00:40:00,916 --> 00:40:04,236 Speaker 2: There's still pockets of weird, yeah. Certainly weird 746 00:40:04,236 --> 00:40:07,596 Speaker 2: Austin is still there. It's all growing up. But yeah, certainly. 747 00:40:10,156 --> 00:40:15,876 Speaker 1: What's your favorite humanoid robot in fiction, in books and movies? 748 00:40:16,076 --> 00:40:17,076 Speaker 2: C-3PO, for sure. 749 00:40:18,036 --> 00:40:20,436 Speaker 1: Okay, you were ready with that. You have that one 750 00:40:20,556 --> 00:40:21,036 Speaker 1: on deck.
751 00:40:21,276 --> 00:40:23,396 Speaker 2: Yeah, well, it makes C-3PO 752 00:40:23,436 --> 00:40:24,556 Speaker 2: the human helper, right? 753 00:40:25,476 --> 00:40:29,156 Speaker 1: What's one thing that you've learned about the human body 754 00:40:29,196 --> 00:40:30,716 Speaker 1: from building robots? 755 00:40:31,836 --> 00:40:35,156 Speaker 2: Oh man. At a high level, what I've learned is 756 00:40:35,156 --> 00:40:39,116 Speaker 2: how amazing the human body really is. I think there's 757 00:40:39,116 --> 00:40:43,156 Speaker 2: this fear from humans that as we sort of continue 758 00:40:43,196 --> 00:40:48,076 Speaker 2: down this pursuit of replicating humans and building machines that 759 00:40:48,156 --> 00:40:51,236 Speaker 2: can do what humans do, that that diminishes what it 760 00:40:51,276 --> 00:40:54,476 Speaker 2: means to be human. But what it's actually done for 761 00:40:54,596 --> 00:40:56,676 Speaker 2: me and most of the people working on this is 762 00:40:56,716 --> 00:41:00,676 Speaker 2: it just makes you appreciate even more how amazing humans are. 763 00:41:01,516 --> 00:41:04,716 Speaker 2: So the hand is something that you think a lot about. 764 00:41:04,876 --> 00:41:07,116 Speaker 2: You just do all these things and you don't appreciate 765 00:41:07,436 --> 00:41:11,996 Speaker 2: how incredible your hands are. And when 766 00:41:12,036 --> 00:41:14,716 Speaker 2: you start to try to build a hand 767 00:41:15,156 --> 00:41:19,356 Speaker 2: for a robot, you just appreciate all the limitations: how 768 00:41:19,396 --> 00:41:22,756 Speaker 2: we walk, how we move, the fact. 769 00:41:22,476 --> 00:41:25,116 Speaker 1: That we can go hard, right? All the things we 770 00:41:25,156 --> 00:41:27,596 Speaker 1: do, just like pick up an egg or open a 771 00:41:27,676 --> 00:41:30,316 Speaker 1: door, like, that's wildly difficult. 772 00:41:29,836 --> 00:41:32,956 Speaker 2: It's amazing. Or you eat that egg and it powers 773 00:41:33,036 --> 00:41:36,396 Speaker 2: you for a day. It powers this neural network brain 774 00:41:36,516 --> 00:41:39,316 Speaker 2: that's, you know, billions of parameters, right? I mean, 775 00:41:39,396 --> 00:41:43,956 Speaker 2: humans are amazing. And I think as we 776 00:41:44,356 --> 00:41:47,476 Speaker 2: continue to learn more about what it means to be human, 777 00:41:47,476 --> 00:41:49,996 Speaker 2: what it means to be conscious, all these kinds 778 00:41:50,036 --> 00:41:52,476 Speaker 2: of big ideas, I think we'll only grow to appreciate 779 00:41:52,556 --> 00:41:53,676 Speaker 2: what we actually have here. 780 00:41:55,556 --> 00:41:58,036 Speaker 1: Last one: tell me about your grandfather. 781 00:41:59,796 --> 00:42:06,396 Speaker 2: Oh man. So, you know, two grandfathers: one, Gilberto Cardinas; 782 00:42:07,356 --> 00:42:12,756 Speaker 2: the other one, George Smith. Both of them were great. 783 00:42:13,276 --> 00:42:17,516 Speaker 2: My granddad Gilberto Cardinas came from Puerto Rico. When he 784 00:42:17,636 --> 00:42:20,796 Speaker 2: was seventeen, he joined the army and fought in the 785 00:42:20,876 --> 00:42:25,036 Speaker 2: Korean War. He spoke five languages. He was self-educated, 786 00:42:26,516 --> 00:42:29,596 Speaker 2: and you know, he had the American dream and dreamed 787 00:42:29,596 --> 00:42:32,116 Speaker 2: of what he could do.
He was in the army, 788 00:42:32,156 --> 00:42:35,156 Speaker 2: he was actually a field medic, but became a hospital administrator, 789 00:42:35,836 --> 00:42:40,996 Speaker 2: and he was a big sort of driving force in our family. 790 00:42:41,076 --> 00:42:44,836 Speaker 2: And I watched him age and watched all the things 791 00:42:44,916 --> 00:42:47,636 Speaker 2: he went through. He actually fell and lost his vision, 792 00:42:47,796 --> 00:42:52,076 Speaker 2: so his brain was still intact, and his body largely was, 793 00:42:52,116 --> 00:42:54,156 Speaker 2: but he couldn't see, and so he had to have 794 00:42:54,196 --> 00:42:56,636 Speaker 2: around-the-clock care when he was in his nineties, 795 00:42:57,076 --> 00:43:01,556 Speaker 2: in the home. And he wasn't wealthy by any stretch, 796 00:43:01,636 --> 00:43:05,236 Speaker 2: but he'd done okay and had saved his money, and 797 00:43:05,356 --> 00:43:08,596 Speaker 2: it was seventeen thousand dollars a month for in-home care. 798 00:43:09,276 --> 00:43:13,516 Speaker 2: And it was like this revolving door of people that 799 00:43:14,196 --> 00:43:17,076 Speaker 2: would rather be doing anything else than sitting in a 800 00:43:17,156 --> 00:43:20,236 Speaker 2: room with my granddad and taking care of him. And so, 801 00:43:20,716 --> 00:43:24,996 Speaker 2: you know, for me, I admired my granddad so much, 802 00:43:25,196 --> 00:43:27,356 Speaker 2: and just seeing sort of that as the end of 803 00:43:27,396 --> 00:43:32,316 Speaker 2: his life, sitting in a room, you know, counting the 804 00:43:32,396 --> 00:43:34,196 Speaker 2: days down, I just thought, there's got to be a 805 00:43:34,236 --> 00:43:37,236 Speaker 2: better way than this. And that was a 806 00:43:37,236 --> 00:43:40,236 Speaker 2: big driver for me doing all this. My other granddad, 807 00:43:40,236 --> 00:43:44,556 Speaker 2: George Smith, actually ended up having 808 00:43:44,596 --> 00:43:47,516 Speaker 2: to go to a home. He had colon cancer, and 809 00:43:47,796 --> 00:43:50,316 Speaker 2: his brain still functioned. He had a great sense of 810 00:43:50,396 --> 00:43:54,716 Speaker 2: humor, and he lost control of his bowels. And so 811 00:43:54,796 --> 00:43:58,956 Speaker 2: you can imagine how humiliating that is as you age: 812 00:43:59,756 --> 00:44:02,236 Speaker 2: to be fully aware of what's going on, having never relied 813 00:44:02,316 --> 00:44:05,436 Speaker 2: on anybody, but not being able to control your bowels. 814 00:44:05,956 --> 00:44:08,076 Speaker 2: And so he had to get multiple showers a day, 815 00:44:08,236 --> 00:44:11,156 Speaker 2: and every time I would go see him, you know, 816 00:44:11,356 --> 00:44:15,036 Speaker 2: it was just a humiliating experience. And so these are 817 00:44:15,116 --> 00:44:17,996 Speaker 2: things that are just my story, but everybody has their 818 00:44:18,036 --> 00:44:21,516 Speaker 2: own story of, you know, taking care of an aging 819 00:44:21,596 --> 00:44:26,396 Speaker 2: parent or grandparent and just what that looks like. 820 00:44:26,396 --> 00:44:29,156 Speaker 2: And my hope is that as humans, as tool makers, 821 00:44:29,756 --> 00:44:31,716 Speaker 2: we can do better than that. And I 822 00:44:31,756 --> 00:44:36,076 Speaker 2: think that these machines we can create will allow us 823 00:44:36,076 --> 00:44:38,796 Speaker 2: to take better care of each other. And my parents 824 00:44:38,876 --> 00:44:43,596 Speaker 2: are already naming their robots.
They can't 825 00:44:43,596 --> 00:44:45,596 Speaker 2: wait to get them. You know, they're 826 00:44:45,596 --> 00:44:48,676 Speaker 2: almost seventy now, and they're like, you've got to have 827 00:44:48,716 --> 00:44:51,476 Speaker 2: these ready for whenever our time comes, so 828 00:44:51,516 --> 00:44:54,676 Speaker 2: that, you know, we can age more gracefully 829 00:44:54,716 --> 00:44:55,916 Speaker 2: than our parents did. 830 00:45:02,276 --> 00:45:05,916 Speaker 1: Jeff Cardinas is the co-founder and CEO of Eptronic. 831 00:45:06,836 --> 00:45:10,676 Speaker 1: Today's show was produced by Gabriel Hunter. It was edited 832 00:45:10,676 --> 00:45:14,356 Speaker 1: by Lydia Jeane Kott and engineered by Sarah Bruguer. You 833 00:45:14,396 --> 00:45:18,156 Speaker 1: can email us at problem at Pushkin dot FM. I'm 834 00:45:18,236 --> 00:45:20,676 Speaker 1: Jacob Goldstein, and we'll be back next week with another 835 00:45:20,716 --> 00:45:33,276 Speaker 1: episode of What's Your Problem.