Speaker 1: Pushkin.

Speaker 2: AI works amazingly well. It works terrifyingly well, even, for virtual things: for words, for pictures, for videos. This is true in large part because of the Internet. The Internet provides this wildly abundant, readily available source of words, pictures, and videos to train AI models. But there is no analogous, wildly abundant, readily available data set for the physical world. There is no gargantuan, Internet-like repository of data that describes how things move and bend and break in real physical space. And as a result, we do not yet have robust AI for the physical world. But people are working on it, and if they succeed, they'll change the way the world works. Not just the world as it appears on our screens, but the actual physical world, the world where if you drop something on your foot, it hurts.

I'm Jacob Goldstein, and this is What's Your Problem, the show where I talk to people who are trying to make technological progress. My guest today is Edward Mehr. He's the co-founder and CEO of Machina Labs.
Edward's problem is this: how can you use AI to turn robots from dumb, inflexible machines into skilled, versatile craftsmen? Before he started Machina Labs, Edward worked in the rocket ship business, first at SpaceX and then at a company called Relativity Space. And in the rocket business, he saw firsthand the problems of traditional manufacturing. It's the kind of problem he's now trying to solve with AI and robots. It's a problem called the rigid factory problem.

So I've heard you use this phrase that's interesting to me, and it's the rigid factory problem. What's the rigid factory problem?

Speaker 1: The main problem with factories today is that rigidity, meaning that if you have to build a physical product, you pretty much have to build a factory that's designed for it and built for it. There are a lot of components that go into the factory, from the machinery all the way to the tooling that is required to build products, and they are specifically designed for the geometry and for the material that you're trying to use.
The moment you want to change that, you have to change your factory, which is a huge investment. You know, I always give an example from when I was at SpaceX. You think of SpaceX as very innovative, and it is, you know, on the edge of the hardware space in terms of innovation. In the past twenty-three, twenty-four years that they have existed, they have two rocket families: there's Starship and there's Falcon. Because the moment you decide on the diameter of, for example, Falcon 9, or the Falcon family in general, the diameter of that core, it's very hard to change it. A lot of tooling and machinery is specifically built for that diameter. And that's why for Starship they had to start from scratch.

Speaker 2: Start from scratch, meaning not just the design, but the factory itself. Like, they had to build a whole new factory because they wanted to make a different sized rocket?

Speaker 1: Yes. Different size, different material, all the tooling has to change. Almost, almost, yeah. You have to basically assume building a factory from scratch, from the ground up.
Why does it need to be there for us to build this new product?

Speaker 2: I heard you describe, and was this from your own experience, the sort of era at SpaceX when the fact that you couldn't make the rocket wider led to all these kind of difficult things people were trying to do, to be like, how can we do all these things under this fundamental constraint? Like, can you talk a little bit about that?

Speaker 1: Yeah. There was a lot of this conversation happening in the twenty twelve, twenty thirteen, twenty fourteen timeframe, when the diameter of the Falcon 9 could not get any larger. And if you look at the different Falcon versions, the height of that vehicle kept going higher; the diameter could not change. So it was about what space can you find to put new features and new designs within the vehicle. There was a lot of stuff basically being crammed into the space that you already had.

Speaker 2: So that's true for building rockets. I mean, what are some other, you know, different kinds of manufactured products where that kind of rigidity is a problem?
Speaker 1: Yeah, I think it is just common in almost all manufacturing. That's why this phenomenon exists. I think it's kind of funny: people take for granted this thing called economies of scale. People take it for granted as if it's a rule of nature. It's actually not.

Speaker 2: Just to be clear, it's basically: the more you build of a thing, the cheaper each one of those things gets. If you build one, it's really expensive. If you build a million, each one's...

Speaker 1: A lot cheaper, yes, exactly. But then people don't think about it. It's like, oh, okay, intuitively it makes sense. But why? It's actually not that intuitive. It's actually a limitation of technology, right? Why economies of scale is a thing is because you have to make a huge amount of investment to make the first thing, and the moment you make the second thing and the third thing, then you can spread that investment onto more products that are going to come out of it, until you break even. But that's only true if the second product can be built with the first investment. We have turned this concept around and said, oh, this is a given, this is an axiom of the world, that economies of scale is a thing. But in reality it is a technological challenge, right? It means that once you build a factory for a car, and all the tooling and dies that go into stamping the panels of that car, that's a hundred and fifty million dollar investment just for stamping. And this is a number, for example, from Tesla. Tesla spent a hundred and fifty million dollars on the stamping plant they have in the Gigafactory in Texas, right? And that can only make the Model Y or the Model 3. The moment you have to change that, that means going through every one of the eighty to a hundred and thirty sheet metal panels that exist on that car and designing a new tool for each. And each of these tools is going to be a few hundred thousand dollars to sometimes a million dollars, or a million and a half. And you're talking about eighty to a hundred and thirty tools per vehicle.

Speaker 2: And like, all you're doing, you're not reinventing the car there, you're just making a car that's a slightly different shape.
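The amortization logic behind this exchange can be sketched in a few lines. The hundred-and-fifty-million-dollar stamping figure is the one quoted in the conversation; the per-car marginal cost below is a made-up number for illustration only, not a real Tesla figure.

```python
# Spreading a fixed tooling investment over production volume.
TOOLING_COST = 150_000_000  # fixed stamping-tooling investment (figure quoted in the episode)
MARGINAL_COST = 30_000      # hypothetical per-car materials/labor cost, for illustration

def unit_cost(n_cars: int) -> float:
    """Cost per car once the tooling investment is amortized over n_cars."""
    return TOOLING_COST / n_cars + MARGINAL_COST

for n in (1_000, 100_000, 1_000_000):
    print(f"{n:>9,} cars -> ${unit_cost(n):,.0f} per car")
```

At a thousand cars the tooling dominates the unit cost; at a million it is nearly invisible, which is why a rigid, tooling-heavy factory only pays off at huge volumes.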
119 00:06:58,716 --> 00:07:01,756 Speaker 1: Basically, yes, maybe you may get sedan, and you're not 120 00:07:02,076 --> 00:07:04,916 Speaker 1: trying to do a slightly longer version, a slightly bigger version. 121 00:07:06,076 --> 00:07:08,476 Speaker 1: And that's why economies a scale of the thing you saying, Okay, 122 00:07:08,516 --> 00:07:11,836 Speaker 1: I made a fact. Now it only pays back if 123 00:07:11,876 --> 00:07:14,556 Speaker 1: I make a million of this car, right, because I 124 00:07:14,596 --> 00:07:16,596 Speaker 1: had to just drop one hundred and fifty million dollars 125 00:07:16,676 --> 00:07:20,116 Speaker 1: on just a stamping plant. So yeah, it's all over manufacturing. 126 00:07:20,156 --> 00:07:22,836 Speaker 1: We abstract this whole concept and gave it the name, 127 00:07:22,916 --> 00:07:24,036 Speaker 1: says economies, of scale. 128 00:07:24,556 --> 00:07:30,556 Speaker 2: Yeah, so you left SpaceX and you went to Relativity Space, right, 129 00:07:30,596 --> 00:07:33,356 Speaker 2: a company that was also in the space business that 130 00:07:33,516 --> 00:07:34,876 Speaker 2: was using three D printing. 131 00:07:34,916 --> 00:07:35,076 Speaker 1: Right. 132 00:07:35,076 --> 00:07:37,396 Speaker 2: That was the idea of the company, which seems like 133 00:07:37,596 --> 00:07:40,556 Speaker 2: an approach to this problem that you're talking about. 134 00:07:40,596 --> 00:07:40,716 Speaker 1: Right. 135 00:07:40,756 --> 00:07:43,996 Speaker 2: An advantage of three D printing is that it is 136 00:07:44,076 --> 00:07:48,156 Speaker 2: much more flexible and less rigid than traditional manufacturing. Right, 137 00:07:48,196 --> 00:07:49,476 Speaker 2: So tell me about that. 138 00:07:50,036 --> 00:07:54,236 Speaker 1: Yeah. So, yeah, we saw this challenge at SpaceX and 139 00:07:54,276 --> 00:07:56,356 Speaker 1: I joined Relativity very early on. I was the fourth 140 00:07:56,396 --> 00:08:01,476 Speaker 1: person on that team. 
And the goal over there was, okay, let's just think about this fundamentally. Can we build a rocket that's all built with flexible technology? At the time, 3D printing was at the forefront of everybody's minds, because people were already starting to build with it. NASA, SpaceX, people were already starting to build engines out of 3D printing. And the concept was, well, that's great, it's very flexible. 3D printing has this promise of being geometry agnostic, material agnostic. You can just feed it a design and it can build a product for you. And it worked very well with rocket engines. I think probably in the future all rocket engines will be 3D printed. And the concept was, can we take this and scale it to the whole vehicle, right? Can we build the whole vehicle with a process like 3D printing so that it is flexible? Today, if we want to build a rocket with a twelve-foot diameter, we can do it. And then if our calculation changes and we want to go to another orbit or do a different type of mission, then we can change that twelve-foot diameter to twenty, twenty-five.
Speaker 2: You don't have to build a new factory, don't have to build new machinery, just change the 3D printer.

Speaker 1: Yeah, exactly. So that was the concept behind Relativity. That's the thesis behind Relativity, and that was the goal there. The goal was, you know, 3D print a whole rocket so it can be flexible.

Speaker 2: But it hasn't, it hasn't worked, at least in the kind of maximalist version, right? Like, they just haven't been able to do it. They've sort of backed off of that big dream, as I understand it.

Speaker 1: Yeah, yeah. So I think the challenge was that 3D printing is just one process, and it's not necessarily good for every type of part. You know, manufacturing is very versatile. You do different types of geometries, different types of material, and 3D printing has a very small reach. For certain types of parts, like rocket engines, it's a very good fit. If you're building a tank, maybe not so much, right? So yeah, it's good for certain types of parts, but there's a whole lot of other parts.
Like I said, you know, if you're building a fuel tank, which is basically large sheet metal, a thin-walled structure, then maybe 3D printing is not as good a fit, because it takes a long time, and also because it's thin, you have a lot of physical challenges in terms of controlling the geometry and the tolerances. So we realized soon that maybe other processes also need to be automated the same way 3D printing is. We need more flexible processes, not just one process. More flexible platforms that can do different types of processes, not just 3D printing, to be able to cover a whole variety of products in a flexible manner, the same way 3D printing does for certain types of products. And that was actually the thinking behind Machina Labs: okay, can we step back and say, what do we need to build?
What is this flexible platform that can do 3D printing if needed, or sheet forming if it's needed, or machining if it's needed, but chooses the right operation, the right flexible operation, for the right part, while still being very agile, not requiring a lot of tooling, and not being inflexible?

Speaker 2: So it's sort of zooming out more. It's saying 3D printing is not going to do everything. The way manufacturing works now, it's just too rigid, too hard to change things, too reliant on scale to make the economics work out. So like, that's a very big, very abstract thought. To start a company, you've got to make something, or you've got to make something that makes something. Like, what do you actually do?

Speaker 1: Yeah. So it was interesting, right? You know, the solution was actually in our past. Right? If you look at, like...

Speaker 2: The lesson in a movie. It's like the Wizard of Oz or something.

Speaker 1: Exactly. If you look at manufacturing, I mean, up until the Industrial Revolution, it was arts and crafts.
Right? It was basically humans trying to figure out how to conquer nature. Like, how am I going to use my hands and my brain and very few primitive tools to deform a product, or shape a product, from raw material? And to this day, if you are in very high-mix manufacturing, a lot of that creativity still exists. There was a person at SpaceX, his name is Big John. I don't think he's there anymore, but there was this guy, he was, you know, a very skilled maker, a craftsman. He could figure out how to use simple tools to build different things in a creative way. Maybe it's not a repeatable way, like, you know, the way stamping works or injection molding works, but you can be flexible. You can do different types of things, and you can be creative about it. So the inspiration came from how humans actually used to do manufacturing. But we realized that in order to be flexible, you actually need two components. You need intelligence, and you need a set of simple tools with a lot of kinematic freedom.
Now you can pick up those simple tools, and as long as you have the intelligence on how to use the tools, in what sequence, and with what kind of process, you can actually do a whole variety of projects.

Speaker 2: And so when you say kinematic freedom, you basically mean like robot arms that can move in lots of different ways. Is that practically what kinematic freedom means in this context?

Speaker 1: Yes. Basically, you can apply these tools with a lot of freedom to the material, right? The same way humans do it. You know, as a human, if you think about it, you can pick up a welder and weld something, and then you drop the welder and you pick up a drill and you put a hole in it, and you drop the drill and you pick up a, you know, hammer, and maybe hammer it into shape.
So you actually have a small set of tools, but you have a lot of kinematic freedom and, most importantly, a very creative mind that tells you how to apply these tools to the material, so you can actually get a very complex set of products and a lot of diversity.

Speaker 2: So plainly, instead of Big John, you want a robot, right? That's the kinematic freedom. The tools are kind of like old tools, but optimized for the robot. And then when you say intelligence, that's the one that feels more frontier-ish. Like, does that mean clever engineers figuring out how to automate the robots? Does it mean AI? Does it mean both?

Speaker 1: Yeah. So I think, yeah, you're basically getting to the crux of how do you scale it, right? You need to have those components, and how does the intelligence piece, which is the most important piece, come into play in an automated fashion? So in the early days we started from the basic intelligence of humans. But then we had a plan to capture data and train AI, so that you can replace the thinking and the creativity that the human had to put in.
Speaker 2: What's the first thing you decided to try and build? What's the first sort of problem you wanted to solve?

Speaker 1: Yeah. So I left Relativity in twenty eighteen, and the idea was there when I left, right? I was like, okay, we need to build basically what I had in my mind. We called it the robot craftsman, the robo-craftsman, at the time. How can you build a robot system, to your point, that can pick up different tools, that has the kinematic freedom, but also has the intelligence? The challenge is, you know, as you said, in order to train these robots with AI, you need to have a lot of data. And this is not data you can find on the Internet.

Speaker 2: Right. This is the AI robotics problem, it seems, right? Like, unlike with large language models. Like, that's why we have large language models and not AI robots, right? Because we have the data just sort of randomly sitting around on the Internet, and we don't have that physical-world data for robots.

Speaker 1: Right, exactly. So basically the problem narrowed down into, okay, how can I generate enough data?
How can I create a business that has a self-sustaining way of generating data, so I can actually build these models, build this intelligence for these robots? And the thinking was, okay, I need to create a solution that can scale in the industry with a limited amount of data and some heuristics. But then, because it's scaling, we can generate a lot of data and start building AI models.

Speaker 2: Right. You need a first thing that you can actually do before you really have AI, to generate the data that will get you to...

Speaker 1: AI, exactly, exactly. So we were thinking, okay, it needs to be a large enough market, right, where we can get mass adoption, and we need to solve a problem that's big enough, where we're at least ten times better than the current solution, so it can actually get adoption, right?

Speaker 2: Meaning you can't just do something as well. You have to do it ten times better.
Speaker 1: Yeah. Because I think what we realized through the last two companies is that if something is not ten times better, it cannot overcome the inertia that exists in an industry for adoption. Because, you know, if you're doing something the same way, and in manufacturing, people have been doing things the old way for hundreds of years, right?

Speaker 2: Yeah. And it's a risk, right? If they're going to try working with you, they're immediately taking a risk. And if it's only going to be a little better, why should I take that risk?

Speaker 1: Exactly. So the idea was, okay, we need to find a large enough market for our first application, and we need to have a solution that's at least ten times better. So that landed us, well, we actually looked at a lot of things, from 3D printing to forging, and then landed on sheet metal. Sheet metal is the largest metal processing sector of all. It's a two hundred and eighty billion dollar industry today, and forming complex sheet metal shapes is very tool intensive.
Speaker 1: So what we started to do was, okay, can we make our robot craftsman's first operation forming sheet metal, basically forming sheet metal the same way a sheet shaper hammers a sheet into shape?

Speaker 2: And when I think about sheet metal, I mean, I don't know anything about sheet metal. I think of, like, cars, I think of planes, right? I think of, like, you know, Detroit, like, stamping. Is that, am I thinking about the right thing? Am I missing a huge sheet metal universe? Like, what's the sheet metal universe?

Speaker 1: Yes. So sheet metal is almost everywhere. I think it's the most common metal part that you see day to day, right? Because most of the time we use metal to be a container for other things. So it's usually a thin metal structure that's formed into a complex shape to hold something else. Now, you know, it can be anything from the case of a computer...
Uh, you know, to a car, right? 347 00:18:23,596 --> 00:18:25,276 Speaker 1: You know, you're sitting on a freeway, you're in 348 00:18:25,316 --> 00:18:28,156 Speaker 1: a sheet metal can, or the airplane 349 00:18:28,196 --> 00:18:31,316 Speaker 1: you're in is a sheet metal can, to a rocket body. 350 00:18:31,716 --> 00:18:31,836 Speaker 2: Uh. 351 00:18:32,116 --> 00:18:33,876 Speaker 1: For a lot of rockets, some will be composites, along with 352 00:18:33,876 --> 00:18:39,636 Speaker 1: a lot of machined metal. And to agricultural heavy 353 00:18:39,636 --> 00:18:45,476 Speaker 1: equipment and machinery, you think of combines, tractors, to even building equipment. 354 00:18:45,516 --> 00:18:50,396 Speaker 1: You look at your HVAC ducts, they're all sheet metal, right, 355 00:18:50,476 --> 00:18:53,316 Speaker 1: because it just makes sense. We mostly use metal 356 00:18:53,316 --> 00:18:56,156 Speaker 1: parts to contain other things, and we give them complex 357 00:18:56,196 --> 00:18:58,756 Speaker 1: shapes, and that's where sheet forming comes into play. So 358 00:18:58,796 --> 00:19:01,796 Speaker 1: you pretty much see it everywhere. But the challenge is 359 00:19:01,796 --> 00:19:04,836 Speaker 1: that in almost all cases, you have to create tooling. 360 00:19:04,876 --> 00:19:06,316 Speaker 1: It goes back to that first problem I mentioned: you 361 00:19:06,396 --> 00:19:09,476 Speaker 1: have to create tooling for each of those geometries. And 362 00:19:09,476 --> 00:19:12,996 Speaker 1: that's why, you know, a Ford needs to make sure 363 00:19:13,116 --> 00:19:15,796 Speaker 1: they can sell a million of an F-150 before 364 00:19:15,796 --> 00:19:18,036 Speaker 1: they can invest in a plant that makes a new 365 00:19:18,116 --> 00:19:19,676 Speaker 1: version of the F-150, right.
366 00:19:19,556 --> 00:19:24,356 Speaker 2: Because you basically have to build a bespoke factory just 367 00:19:24,436 --> 00:19:28,236 Speaker 2: to shape sheet metal in a new way, exactly, for 368 00:19:28,276 --> 00:19:30,836 Speaker 2: a new geometry, for a new design. Exactly. Where 369 00:19:30,876 --> 00:19:34,356 Speaker 2: is that a particular problem? Like, where 370 00:19:34,396 --> 00:19:36,916 Speaker 2: does that acutely bind, the fact that 371 00:19:37,156 --> 00:19:39,236 Speaker 2: sheet metal is so hard to do if you're not 372 00:19:39,276 --> 00:19:39,756 Speaker 2: working at scale? 373 00:19:39,716 --> 00:19:42,716 Speaker 1: So expensive? Yeah. So I think now you're coming 374 00:19:42,716 --> 00:19:45,876 Speaker 1: to even the third stage of how do you 375 00:19:45,916 --> 00:19:48,476 Speaker 1: scale this technology? You need to first find... You know, 376 00:19:48,476 --> 00:19:49,796 Speaker 1: you said you need to be ten X better; 377 00:19:49,836 --> 00:19:53,036 Speaker 1: right, you need to be in an area that has a lot 378 00:19:53,076 --> 00:19:54,236 Speaker 1: of pain today. 379 00:19:54,316 --> 00:19:56,436 Speaker 2: I was like, oh my god, thank god, you've walked 380 00:19:56,476 --> 00:19:58,236 Speaker 2: through the door. We've been waiting for you. 381 00:19:58,636 --> 00:20:02,436 Speaker 1: Yeah. So it ended up being very much defense and aerospace. Right, 382 00:20:02,516 --> 00:20:06,756 Speaker 1: so think of, you know, think of our military, for example. 383 00:20:07,196 --> 00:20:12,836 Speaker 1: Right, today, they have fifty, sixty different weapon systems 384 00:20:13,116 --> 00:20:15,956 Speaker 1: or defense systems, you can basically think of, like, aircraft 385 00:20:17,236 --> 00:20:19,916 Speaker 1: that they're maintaining.
And some of these systems have been 386 00:20:19,916 --> 00:20:23,036 Speaker 1: built from sixty, seventy, eighty years ago, like think of 387 00:20:23,116 --> 00:20:26,156 Speaker 1: the B-52, the C-130, like World War Two. 388 00:20:25,996 --> 00:20:27,916 Speaker 2: Planes still flying, kind of? 389 00:20:27,756 --> 00:20:31,836 Speaker 1: Still flying, yes, exactly, and they have, like, you know, 390 00:20:31,996 --> 00:20:35,076 Speaker 1: thirty of one, fifty of another, one hundred of another one. 391 00:20:35,396 --> 00:20:38,076 Speaker 1: And these things break down, right, and unlike a 392 00:20:38,116 --> 00:20:41,996 Speaker 1: Ford factory, there is no factory for the seventy different products 393 00:20:42,036 --> 00:20:43,156 Speaker 1: that they're carrying. 394 00:20:42,876 --> 00:20:46,156 Speaker 2: Right, and presumably the factory they built in nineteen forty 395 00:20:46,196 --> 00:20:48,076 Speaker 2: one to build this plane doesn't 396 00:20:47,796 --> 00:20:50,716 Speaker 1: exist anymore. It doesn't, exactly. Even the vendor might have completely 397 00:20:50,756 --> 00:20:54,356 Speaker 1: disappeared, right, that made that specific component. So they're 398 00:20:54,396 --> 00:20:57,796 Speaker 1: constantly battling with this challenge of: an aircraft goes down, 399 00:20:58,036 --> 00:20:59,876 Speaker 1: how can I fix it? How can I find the part? 400 00:20:59,916 --> 00:21:02,276 Speaker 1: And there are thousands of parts in each of these aircraft, right, 401 00:21:02,316 --> 00:21:04,436 Speaker 1: so any of them can go down, and that's a 402 00:21:04,516 --> 00:21:06,836 Speaker 1: huge challenge.
I mean, if you look at, you know, 403 00:21:06,916 --> 00:21:11,436 Speaker 1: the Government Accountability Office put this report out, 404 00:21:11,756 --> 00:21:13,116 Speaker 1: I think it was a couple of years ago or 405 00:21:13,156 --> 00:21:16,676 Speaker 1: a year ago, about how ready each weapon system is 406 00:21:16,716 --> 00:21:19,436 Speaker 1: to defend the United States. Out of the forty-eight 407 00:21:19,476 --> 00:21:22,836 Speaker 1: or forty-nine weapon systems they looked into, only one, 408 00:21:23,196 --> 00:21:28,196 Speaker 1: only one, in the past eleven years, was ready every year, right. 409 00:21:29,076 --> 00:21:31,996 Speaker 1: I think only the top four were ready for at least half 410 00:21:32,076 --> 00:21:35,236 Speaker 1: of the years, right. So that means in most 411 00:21:35,316 --> 00:21:38,036 Speaker 1: years these weapons are not ready to fight, like. 412 00:21:37,996 --> 00:21:39,156 Speaker 2: They're waiting for parts. 413 00:21:39,196 --> 00:21:42,236 Speaker 1: They're waiting for parts. Something is broken, something is damaged, 414 00:21:42,596 --> 00:21:44,756 Speaker 1: and if you go deeper, some of these components 415 00:21:44,756 --> 00:21:47,156 Speaker 1: take four years to be replaced. So if a plane 416 00:21:47,156 --> 00:21:49,636 Speaker 1: gets damaged, it needs to sit on the ground for 417 00:21:49,676 --> 00:21:51,996 Speaker 1: four years before the part can be replaced, 418 00:21:52,076 --> 00:21:55,196 Speaker 1: and the cost of replacement is basically building another factory. 419 00:21:55,436 --> 00:21:57,396 Speaker 1: So some of these parts, and think of it, a 420 00:21:57,476 --> 00:22:00,316 Speaker 1: landing gear door that goes on a plane will cost 421 00:22:00,316 --> 00:22:02,556 Speaker 1: them eight hundred thousand dollars, for example, because they have
422 00:22:02,516 --> 00:22:05,796 Speaker 2: to go make it, because it's bespoke, essentially, like buying 423 00:22:05,836 --> 00:22:08,316 Speaker 2: a bespoke suit or something. It's just, like, it's gonna 424 00:22:08,356 --> 00:22:08,796 Speaker 2: cost a lot. 425 00:22:08,876 --> 00:22:11,556 Speaker 1: Yeah, yeah. So the idea started there. I think that 426 00:22:11,636 --> 00:22:14,596 Speaker 1: was one of our first customers. Can we make defense 427 00:22:14,636 --> 00:22:19,036 Speaker 1: manufacturing more agile? It directly affects our national readiness for military 428 00:22:19,076 --> 00:22:22,156 Speaker 1: conflict, and it's a huge problem. But then, you know, 429 00:22:22,276 --> 00:22:24,996 Speaker 1: even in a broader sense, any defense product or aerospace 430 00:22:25,076 --> 00:22:28,916 Speaker 1: product usually has very low volume but a high mix of products. 431 00:22:29,196 --> 00:22:31,516 Speaker 1: You know, even, you know, you're building a missile, you 432 00:22:31,556 --> 00:22:33,396 Speaker 1: make, like, you know, a few thousand a year, and 433 00:22:33,436 --> 00:22:35,796 Speaker 1: you might make five, six, seven different versions of it, right? 434 00:22:35,796 --> 00:22:38,396 Speaker 1: So it's very unlike cars, where, you know, you make 435 00:22:38,436 --> 00:22:40,156 Speaker 1: a million of the same car over and over. 436 00:22:40,716 --> 00:22:44,116 Speaker 1: So that ends up being our first application, which we've 437 00:22:44,116 --> 00:22:47,636 Speaker 1: got a lot of traction with. But, you know, 438 00:22:47,676 --> 00:22:50,156 Speaker 1: even outside of that, you know, you look at companies 439 00:22:50,316 --> 00:22:53,796 Speaker 1: like Caterpillar, like the John Deeres of the world. These folks 440 00:22:53,836 --> 00:22:55,556 Speaker 1: are also in the same boat.
You know, they make 441 00:22:55,756 --> 00:22:58,676 Speaker 1: two hundred combines, right, but they need to support them 442 00:22:58,676 --> 00:23:01,316 Speaker 1: in the field. And these folks have exactly the same problem, right? 443 00:23:01,396 --> 00:23:05,396 Speaker 1: You know, do I need to run a large factory 444 00:23:05,956 --> 00:23:08,156 Speaker 1: to support all these models at all times? That 445 00:23:08,196 --> 00:23:11,716 Speaker 1: would be very expensive, to support, like, one hundred vehicles out there. 446 00:23:11,556 --> 00:23:17,436 Speaker 2: Still to come on the show, we'll talk about 447 00:23:17,436 --> 00:23:20,836 Speaker 2: the future of AI and robotics at Machina Labs 448 00:23:20,876 --> 00:23:33,396 Speaker 2: and beyond. And so you got the right market. Now 449 00:23:33,396 --> 00:23:35,556 Speaker 2: you've got to make a thing. You've got to figure 450 00:23:35,556 --> 00:23:37,916 Speaker 2: out how to actually do the thing, how to make 451 00:23:37,916 --> 00:23:40,956 Speaker 2: your idea come true. Like, how does that work? 452 00:23:41,436 --> 00:23:44,796 Speaker 1: So the idea originally was, can we get rid of 453 00:23:44,796 --> 00:23:47,436 Speaker 1: the die, right, and do it the same way a 454 00:23:47,516 --> 00:23:50,876 Speaker 1: sheet shaper forms a sheet of metal? And what does 455 00:23:50,876 --> 00:23:52,876 Speaker 1: a sheet shaper do? A sheet shaper starts from a 456 00:23:52,916 --> 00:23:56,516 Speaker 1: flat sheet of metal and slowly hammers it into shape. 457 00:23:57,116 --> 00:23:59,076 Speaker 1: So what we wanted to do was have a robot 458 00:23:59,196 --> 00:24:03,116 Speaker 1: do that, right, have a robotic system basically do that incremental 459 00:24:03,156 --> 00:24:04,956 Speaker 1: deformation into shape. We call it roboforming. 460 00:24:05,716 --> 00:24:07,516 Speaker 2: So you're sort of bending it, right? I mean, you're 461 00:24:07,516 --> 00:24:09,996 Speaker 2: hammering it.
It's sort of like if you take a, whatever, 462 00:24:10,076 --> 00:24:11,916 Speaker 2: cut open an aluminum can and kind of bend it 463 00:24:11,956 --> 00:24:15,676 Speaker 2: into shape. Like, that's a version of what's happening here, right? Yes, 464 00:24:16,116 --> 00:24:17,996 Speaker 2: exactly, in a complicated way, yeah. 465 00:24:17,716 --> 00:24:20,996 Speaker 1: Exactly, you're right. I mean, the same way a potter 466 00:24:22,036 --> 00:24:24,436 Speaker 1: forms a clay bowl. That's basically what our robots do. 467 00:24:24,476 --> 00:24:26,796 Speaker 1: They start from a flat sheet of metal and slowly 468 00:24:26,836 --> 00:24:28,876 Speaker 1: deform it into shape, the same way a potter would 469 00:24:28,996 --> 00:24:31,556 Speaker 1: form a clay bowl, or a sheet shaper 470 00:24:31,556 --> 00:24:33,876 Speaker 1: hammers a sheet into shape. Yeah, so I've seen 471 00:24:33,916 --> 00:24:34,116 Speaker 1: it, right? 472 00:24:34,156 --> 00:24:35,876 Speaker 2: So there will be a sheet of metal, like, hanging 473 00:24:36,076 --> 00:24:39,156 Speaker 2: up in, whatever, above the ground. And then you 474 00:24:39,156 --> 00:24:42,316 Speaker 2: have a robot arm on either side, right, like one 475 00:24:42,356 --> 00:24:44,916 Speaker 2: on one side, one on the other. And then what happens? 476 00:24:44,956 --> 00:24:48,036 Speaker 1: Basically, the robots come together from both sides of the 477 00:24:48,036 --> 00:24:51,996 Speaker 1: sheet and they pinch the sheet in a certain way 478 00:24:52,196 --> 00:24:57,636 Speaker 1: so that the location they're pinching slightly stretches and deforms. Right. 479 00:24:57,876 --> 00:25:02,636 Speaker 1: And if you start applying this pinching all over the 480 00:25:02,636 --> 00:25:05,796 Speaker 1: sheet, incrementally, you slowly start to form it into 481 00:25:05,836 --> 00:25:09,116 Speaker 1: a shape. Right.
So instead of, traditionally, you would use a 482 00:25:09,276 --> 00:25:12,236 Speaker 1: die, and with the sheer pressure of the press, push the 483 00:25:12,236 --> 00:25:14,116 Speaker 1: sheet against the die to give it a shape. Now 484 00:25:14,116 --> 00:25:17,196 Speaker 1: the robots are like a craftsman, like a tradesperson 485 00:25:17,516 --> 00:25:20,716 Speaker 1: coming in to slowly deform the sheet into shape by 486 00:25:20,796 --> 00:25:23,116 Speaker 1: just applying pressure. So one robot is pushing it, the 487 00:25:23,156 --> 00:25:26,756 Speaker 1: other robot is supporting it, and by applying a pinch 488 00:25:26,836 --> 00:25:29,916 Speaker 1: you slightly stretch the material and you form it into 489 00:25:29,956 --> 00:25:30,476 Speaker 1: a shape. 490 00:25:30,716 --> 00:25:33,876 Speaker 2: So, I mean, the way you describe it, it makes 491 00:25:33,956 --> 00:25:38,996 Speaker 2: sense and it sounds easy. I'm sure it wasn't easy. 492 00:25:39,276 --> 00:25:43,876 Speaker 2: Like, were there things that just didn't work for a while? 493 00:25:44,236 --> 00:25:47,196 Speaker 1: So, you should have been here the first time 494 00:25:47,236 --> 00:25:50,236 Speaker 1: we actually tried to form a part. The part looked 495 00:25:50,316 --> 00:25:53,156 Speaker 1: like it was, like, a ghost of the geometry that 496 00:25:53,196 --> 00:25:55,076 Speaker 1: we wanted to make, and actually in the end it 497 00:25:55,156 --> 00:25:58,556 Speaker 1: tore, right. So think about it. You have this very 498 00:25:58,596 --> 00:26:03,036 Speaker 1: flimsy sheet, you're applying pressure to it, and if you apply 499 00:26:03,116 --> 00:26:08,036 Speaker 1: pressure slightly wrong, right, it can potentially tear it. It 500 00:26:08,076 --> 00:26:10,356 Speaker 1: can form it into a different shape.
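[Editor's note: a minimal toy sketch of the incremental forming idea described here, with a sheet modeled as a height map and each robot "pinch" as a small local deformation along a toolpath. All grid sizes, depths, and radii are invented for illustration; none are Machina Labs parameters.]

```python
# Toy sketch of incremental sheet forming: a flat sheet, modeled as a
# height map, is deformed by many small pinches along a toolpath rather
# than by pressing it against a die. All numbers are illustrative.
import math

SIZE = 21          # grid points per side (hypothetical discretization)
STEP_DEPTH = 0.2   # depth pushed per pinch, in arbitrary units (assumed)
RADIUS = 2.5       # grid-units of local deformation around the tool tip

sheet = [[0.0] * SIZE for _ in range(SIZE)]  # flat sheet: height 0 everywhere

def pinch(sheet, cx, cy, depth, radius=RADIUS):
    """One tool contact: locally stretch the sheet downward with smooth falloff."""
    for y in range(SIZE):
        for x in range(SIZE):
            d = math.hypot(x - cx, y - cy)
            if d < radius:
                # Gaussian-like falloff so the dent blends into the sheet
                sheet[y][x] -= depth * math.exp(-((d / radius) ** 2) * 3)

def form_bowl(sheet, passes=5):
    """Trace shrinking square contours, one pinch at each toolpath point."""
    for p in range(passes):
        inset = 4 + p                       # contour moves inward each pass
        span = SIZE - 2 * inset             # points per side at this inset
        for t in range(4 * span):
            side, u = divmod(t, span)       # walk the square contour
            if side == 0:   cx, cy = inset + u, inset
            elif side == 1: cx, cy = SIZE - 1 - inset, inset + u
            elif side == 2: cx, cy = SIZE - 1 - inset - u, SIZE - 1 - inset
            else:           cx, cy = inset, SIZE - 1 - inset - u
            pinch(sheet, cx, cy, STEP_DEPTH)

form_bowl(sheet)
center_depth = sheet[SIZE // 2][SIZE // 2]  # deepest near the contours
edge_height = sheet[0][0]                   # untouched clamped edge
print(f"center deformed to {center_depth:.3f}, edge still at {edge_height:.3f}")
```

The sketch also shows why the process is delicate: every pinch changes the neighborhood the next pinch lands on, so an open-loop toolpath drifts from the target geometry, which is the accuracy problem discussed next.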
And also, the 501 00:26:10,396 --> 00:26:12,556 Speaker 1: whole sheet is moving the whole time you're trying to 502 00:26:12,596 --> 00:26:15,316 Speaker 1: form it. The whole sheet is moving because 503 00:26:15,316 --> 00:26:18,636 Speaker 1: it's very flimsy. It's not a rigid structure, right. So 504 00:26:19,276 --> 00:26:22,116 Speaker 1: the main challenge was, how do you get this accurate, right? 505 00:26:22,196 --> 00:26:23,676 Speaker 1: How do you get this process accurate? How do you 506 00:26:23,676 --> 00:26:27,116 Speaker 1: get accuracy? And the idea was, what does the robot 507 00:26:27,156 --> 00:26:30,436 Speaker 1: need to do, given the chaotic nature of 508 00:26:30,476 --> 00:26:32,436 Speaker 1: the process, where the sheet moves, and if you apply 509 00:26:32,476 --> 00:26:35,116 Speaker 1: too much pressure, it will deform in a bad 510 00:26:35,116 --> 00:26:38,276 Speaker 1: way or it might tear. If you apply not enough pressure, 511 00:26:38,316 --> 00:26:40,916 Speaker 1: it might just not form. So how do you come 512 00:26:40,996 --> 00:26:43,436 Speaker 1: up with the right set of robot movements and process 513 00:26:43,436 --> 00:26:46,556 Speaker 1: parameters to form the part? And that was the problem 514 00:26:46,596 --> 00:26:48,956 Speaker 1: we wanted to solve with AI, right. But we didn't 515 00:26:48,996 --> 00:26:51,676 Speaker 1: have the data in the beginning, right. The 516 00:26:52,036 --> 00:26:56,076 Speaker 1: idea was that if I form enough parts with this process, 517 00:26:56,196 --> 00:26:58,796 Speaker 1: and I can capture all the data throughout the process, 518 00:26:58,796 --> 00:27:01,356 Speaker 1: where did the robot go, how much pressure did it apply, 519 00:27:01,636 --> 00:27:04,796 Speaker 1: and what was the resulting geometry?
Then I can start building 520 00:27:04,796 --> 00:27:07,676 Speaker 1: a model that correlates the inputs to the outputs, 521 00:27:07,996 --> 00:27:09,996 Speaker 1: and I can explore this and say, okay, in order 522 00:27:10,036 --> 00:27:12,436 Speaker 1: to get to the right output, I need these inputs. 523 00:27:12,836 --> 00:27:15,076 Speaker 1: But we didn't have them in the beginning. So the 524 00:27:15,116 --> 00:27:18,076 Speaker 1: idea was two things. One was, maybe we can simulate 525 00:27:18,116 --> 00:27:22,156 Speaker 1: the data, right? And very early on we started doing 526 00:27:22,156 --> 00:27:24,876 Speaker 1: some simulation, physics-based simulation, and we soon realized that in 527 00:27:24,956 --> 00:27:28,076 Speaker 1: order to get an accurate result, the simulations are going 528 00:27:28,156 --> 00:27:31,956 Speaker 1: to be very computationally intensive. A simulation of a part 529 00:27:31,996 --> 00:27:35,636 Speaker 1: that took only fifteen minutes to form took us one 530 00:27:35,676 --> 00:27:39,236 Speaker 1: week on a twenty-seven-core machine. Wow. Right. So, okay, 531 00:27:39,476 --> 00:27:42,996 Speaker 1: simulation not only is not accurate, it takes forever. So 532 00:27:43,036 --> 00:27:46,076 Speaker 1: we realized, okay, that's not the right route. The 533 00:27:46,156 --> 00:27:48,316 Speaker 1: other route was, like, okay, we can also form a 534 00:27:48,356 --> 00:27:51,516 Speaker 1: lot of parts and gather the data. But in order 535 00:27:51,556 --> 00:27:53,236 Speaker 1: to do that, we go back to that same problem: 536 00:27:53,276 --> 00:27:54,876 Speaker 1: we need to have scale. We need to have 537 00:27:54,916 --> 00:27:56,836 Speaker 1: a lot of these machines forming these parts to get 538 00:27:56,836 --> 00:27:57,276 Speaker 1: that data.
539 00:27:57,316 --> 00:27:59,716 Speaker 2: I mean, one of the big AI insights of the 540 00:27:59,796 --> 00:28:03,556 Speaker 2: last, whatever, decade is, like, you need a ton of data, 541 00:28:03,596 --> 00:28:06,996 Speaker 2: which is easy if it's words, but hard if it's metal. Right? 542 00:28:07,116 --> 00:28:11,596 Speaker 1: Yes. What we ended up doing was creating a hybrid model. 543 00:28:11,836 --> 00:28:15,276 Speaker 1: We said, okay, what if we keep the humans in 544 00:28:15,316 --> 00:28:18,196 Speaker 1: the loop, so the human can give an instruction initially 545 00:28:18,236 --> 00:28:21,836 Speaker 1: based on heuristics, and then we look at the data 546 00:28:21,916 --> 00:28:25,636 Speaker 1: and the human can adjust, and then iterate on that. 547 00:28:25,796 --> 00:28:29,036 Speaker 1: But all the while we are capturing all this data, and over time, 548 00:28:29,196 --> 00:28:31,556 Speaker 1: as we're capturing the data, we start building the models 549 00:28:31,876 --> 00:28:35,716 Speaker 1: that will help the human do fewer trials, right? It's 550 00:28:35,716 --> 00:28:38,516 Speaker 1: basically guided reinforcement learning, right, and the humans are actually 551 00:28:38,516 --> 00:28:41,436 Speaker 1: guiding it where to go, but it's exploring those areas. 552 00:28:41,516 --> 00:28:43,916 Speaker 1: But after a while, once we started forming thousands 553 00:28:43,916 --> 00:28:47,236 Speaker 1: of parts, then you can start feeding this data into the model. 554 00:28:47,236 --> 00:28:50,756 Speaker 1: Then the model will be like, okay, human, you don't 555 00:28:50,756 --> 00:28:52,676 Speaker 1: need to do twenty-five different trials. Now you can 556 00:28:52,716 --> 00:28:54,236 Speaker 1: do five trials and you're going to get to the 557 00:28:54,276 --> 00:28:56,356 Speaker 1: right place, which is actually the number we are at 558 00:28:56,436 --> 00:28:56,876 Speaker 1: right now.
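[Editor's note: the human-in-the-loop iteration described here, form a part, inspect it, adjust, and log every trial as training data, can be sketched in a few lines. The `form_part` function, the target value, and both adjustment rules are stand-ins invented for illustration; only the overall loop structure follows the description above.]

```python
# Sketch of the hybrid trial loop: physical trials generate the data that
# eventually lets a model advise the human, cutting trials from ~25 to ~5.

def form_part(pressure):
    """Stand-in for the physical process: parameter -> geometry error."""
    target = 7.0                       # pretend ideal pressure (assumed)
    return abs(pressure - target)      # deviation from the target shape

def trial_loop(initial_guess, adjust, tolerance=0.1, max_trials=25):
    """Form, inspect, adjust; log every trial as future training data."""
    log = []                           # (input, output) pairs for the model
    guess = initial_guess
    for trial in range(1, max_trials + 1):
        error = form_part(guess)
        log.append((guess, error))
        if error <= tolerance:
            return trial, log          # recipe found
        guess = adjust(guess, error, log)
    return max_trials, log

# Early days: a coarse human heuristic with small fixed steps,
# so many sheets get scrapped before the recipe converges.
naive = lambda g, e, log: g + 0.5

# With a model trained on past parts advising the step size, the same
# target is reached in far fewer trials (guided search, not fixed steps).
guided = lambda g, e, log: g + 0.8 * e   # proportional correction (illustrative)

naive_trials, _ = trial_loop(0.0, naive)
guided_trials, _ = trial_loop(0.0, guided)
print(f"heuristic: {naive_trials} trials, model-guided: {guided_trials} trials")
```

The point of the sketch is the `log`: every physical trial, good or bad, becomes an input-output pair, which is exactly the data the real model is trained on.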
559 00:28:56,956 --> 00:29:00,396 Speaker 2: And that's happening in the physical world, largely, those iterations. 560 00:29:00,436 --> 00:29:02,636 Speaker 2: Like, you're trying a piece of metal and it's bad 561 00:29:02,676 --> 00:29:04,516 Speaker 2: and it tears, and you do another piece of metal 562 00:29:04,516 --> 00:29:06,556 Speaker 2: and it's a little less bad, and eventually... 563 00:29:06,196 --> 00:29:09,836 Speaker 1: Exactly, exactly. And that initially would take twenty-five parts, 564 00:29:10,476 --> 00:29:13,156 Speaker 1: like, you know, before we'd find a recipe for that design. 565 00:29:13,716 --> 00:29:16,396 Speaker 1: But twenty-five parts still was better than the traditional alternative. 566 00:29:16,516 --> 00:29:18,356 Speaker 2: When you say twenty-five parts, I mean twenty-five 567 00:29:18,396 --> 00:29:21,196 Speaker 2: tries, twenty-five pieces of metal before you make the 568 00:29:21,236 --> 00:29:22,916 Speaker 2: part the right way? Exactly. 569 00:29:23,356 --> 00:29:25,556 Speaker 1: And that was, like, you know, they would sit down 570 00:29:25,596 --> 00:29:27,316 Speaker 1: basically twenty-five days in a row, so in a 571 00:29:27,356 --> 00:29:30,796 Speaker 1: month they could actually find a recipe, where traditionally making 572 00:29:30,796 --> 00:29:32,716 Speaker 1: a mold would take at least three, four months, right? 573 00:29:32,756 --> 00:29:36,156 Speaker 1: So we were still better. But now, over time, 574 00:29:36,196 --> 00:29:38,716 Speaker 1: once we generated the data, the model can 575 00:29:38,796 --> 00:29:43,116 Speaker 1: tell the engineer, okay, maybe you want to choose these parameters. 576 00:29:43,276 --> 00:29:46,396 Speaker 1: It's now becoming an advisor, down to five trials.
577 00:29:46,556 --> 00:29:48,436 Speaker 1: In five trials, we can actually get to the right 578 00:29:48,476 --> 00:29:50,756 Speaker 1: part, and then hopefully in the future we get to 579 00:29:50,796 --> 00:29:53,076 Speaker 1: a point where, you know, the machine will tell the 580 00:29:53,156 --> 00:29:55,316 Speaker 1: robots what to do and the human can be completely 581 00:29:55,316 --> 00:29:57,676 Speaker 1: out of the loop. Yeah. But the idea was, like, 582 00:29:57,716 --> 00:29:59,556 Speaker 1: how do you kind of create that hybrid model that's 583 00:29:59,556 --> 00:30:03,116 Speaker 1: efficient, so that we can generate the data until the 584 00:30:03,156 --> 00:30:05,596 Speaker 1: model is good enough to do the job itself? 585 00:30:05,956 --> 00:30:09,076 Speaker 2: And you find that the data is sort of generalizable? 586 00:30:09,436 --> 00:30:11,996 Speaker 2: I mean, clearly, like, making one kind of part makes 587 00:30:12,076 --> 00:30:16,076 Speaker 2: the model, the AI, smarter about making another kind of part? 588 00:30:16,516 --> 00:30:18,956 Speaker 1: Yes, you know, yeah, it is. It's kind of interesting. 589 00:30:18,996 --> 00:30:21,236 Speaker 1: I think people don't think about it. I used to 590 00:30:21,316 --> 00:30:23,796 Speaker 1: do sheet shaping by hand, right? That was one of 591 00:30:23,796 --> 00:30:25,956 Speaker 1: the hobbies I had. I was working with this shop 592 00:30:25,956 --> 00:30:29,036 Speaker 1: in Pomona where we would actually hammer sheets into shape, 593 00:30:29,356 --> 00:30:31,036 Speaker 1: and we used to say, you know, if you spent 594 00:30:31,156 --> 00:30:33,516 Speaker 1: five years doing it, you're really good. You get really 595 00:30:33,516 --> 00:30:36,236 Speaker 1: good at it.
And I used to think, you know, okay, 596 00:30:36,236 --> 00:30:38,356 Speaker 1: after five years of doing this, yes, you have this 597 00:30:38,396 --> 00:30:40,716 Speaker 1: intuitive understanding where you look at the sheet and you're like, okay, 598 00:30:40,756 --> 00:30:42,596 Speaker 1: this place needs to be hammered more, this place 599 00:30:42,636 --> 00:30:44,836 Speaker 1: needs to be hammered less. 600 00:30:44,876 --> 00:30:46,876 Speaker 1: It was intuitive. You couldn't explain why 601 00:30:46,876 --> 00:30:49,636 Speaker 1: you thought this needed to happen. There was no physical explanation. 602 00:30:49,996 --> 00:30:52,356 Speaker 1: None of these people who were sheet shaping got PhDs 603 00:30:52,356 --> 00:30:55,236 Speaker 1: in material science. Yeah, they just learned over time, seeing 604 00:30:55,236 --> 00:30:58,436 Speaker 1: the pattern of how the sheet formed. Yes, craftsmanship, that's 605 00:30:58,516 --> 00:31:02,036 Speaker 1: craftsmanship, right. Yeah, but it really reminded me of, okay, these 606 00:31:02,036 --> 00:31:04,716 Speaker 1: people know how to do it, but without really 607 00:31:04,716 --> 00:31:07,276 Speaker 1: being able to explain it, after doing it for five years. 608 00:31:07,316 --> 00:31:09,076 Speaker 2: It's that kind of tacit knowledge. 609 00:31:09,316 --> 00:31:13,236 Speaker 1: Yeah, and it reminded me of the same challenge we had, 610 00:31:14,116 --> 00:31:16,716 Speaker 1: the early machine learning challenge, where it was like, okay, a 611 00:31:16,796 --> 00:31:19,476 Speaker 1: human can look at two pictures and say, okay, this 612 00:31:19,556 --> 00:31:21,916 Speaker 1: is a cat and this is a dog. Something happens 613 00:31:21,916 --> 00:31:23,436 Speaker 1: in their brain that knows which is a cat, but 614 00:31:23,436 --> 00:31:25,836 Speaker 1: they cannot really define why they're calling this one a cat and 615 00:31:25,916 --> 00:31:28,036 Speaker 1: this one a dog.
So that was where it started 616 00:31:28,036 --> 00:31:29,996 Speaker 1: to click for me. If I can capture enough data, 617 00:31:30,356 --> 00:31:34,476 Speaker 1: five years' worth of data, right, of a human, then 618 00:31:34,556 --> 00:31:36,556 Speaker 1: I should be able to get to a very good 619 00:31:36,636 --> 00:31:40,036 Speaker 1: sheet shaper, right. And, you know, it's funny, back of 620 00:31:40,036 --> 00:31:42,116 Speaker 1: the envelope, I was like, okay, humans are, you know, 621 00:31:42,156 --> 00:31:45,076 Speaker 1: receiving X amount of megabytes a second, okay, how much is five 622 00:31:45,116 --> 00:31:47,796 Speaker 1: years' worth of data? So, roughly, I 623 00:31:47,796 --> 00:31:50,236 Speaker 1: think once we get a certain amount of data, I 624 00:31:50,236 --> 00:31:53,636 Speaker 1: think we have enough data to be able to basically 625 00:31:53,676 --> 00:31:56,596 Speaker 1: replace, not replace, replicate the mentality or the 626 00:31:56,636 --> 00:31:58,316 Speaker 1: model that the sheet shaper has in their mind. 627 00:31:59,236 --> 00:32:02,716 Speaker 2: So how many years of kind of human- 628 00:32:02,956 --> 00:32:06,556 Speaker 2: level craftsman sheet shaping data does the model have at 629 00:32:06,556 --> 00:32:07,076 Speaker 2: this point? 630 00:32:07,316 --> 00:32:09,556 Speaker 1: Yeah, so I think the last time I checked, one year ago, 631 00:32:09,596 --> 00:32:11,436 Speaker 1: we were, like, three-fourths of the 632 00:32:11,436 --> 00:32:13,076 Speaker 1: way there in terms of the data that we have 633 00:32:14,236 --> 00:32:16,036 Speaker 1: for just sheet shaping, right. So once we get to 634 00:32:16,276 --> 00:32:17,996 Speaker 1: the full amount, I think at that point 635 00:32:18,036 --> 00:32:19,916 Speaker 1: we have no excuse. We have enough data. The model 636 00:32:19,916 --> 00:32:21,636 Speaker 1: should be good.
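[Editor's note: the back-of-the-envelope estimate mentioned here, "humans receive X megabytes a second, so how much is five years of sheet-shaping experience?", works out like this. The rate, hours, and working days are all assumptions for illustration, not figures from the interview.]

```python
# Back-of-the-envelope: data volume of "five years of craft experience".
# Every constant below is an assumed, illustrative value.

MB_PER_SECOND = 1.0    # assumed sensory/telemetry rate while working
HOURS_PER_DAY = 8      # assumed workday
DAYS_PER_YEAR = 250    # assumed working days per year
YEARS = 5              # "five years to get really good"

seconds = YEARS * DAYS_PER_YEAR * HOURS_PER_DAY * 3600
total_mb = seconds * MB_PER_SECOND
total_tb = total_mb / 1_000_000

print(f"{seconds:,} seconds of practice, about {total_tb:.0f} TB at {MB_PER_SECOND} MB/s")
```

Even at a modest assumed 1 MB/s, five working years comes to tens of terabytes, which gives a rough yardstick for "three-fourths of the way there".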
We just need to figure out 637 00:32:22,476 --> 00:32:24,516 Speaker 1: why it's not, or how far from it it is. 638 00:32:24,876 --> 00:32:29,836 Speaker 2: It is interesting to analogize it to, like, human craftsmanship, right? 639 00:32:29,876 --> 00:32:31,676 Speaker 2: And, I mean, even if you want to zoom out 640 00:32:31,716 --> 00:32:34,756 Speaker 2: even more, the, like, fifty-year history of AI, where 641 00:32:34,796 --> 00:32:36,716 Speaker 2: first everybody was like, oh, you just got to teach 642 00:32:36,716 --> 00:32:39,396 Speaker 2: the machine all the rules for, to use your example, 643 00:32:39,476 --> 00:32:41,076 Speaker 2: like, what's a cat and what's a dog. But then 644 00:32:41,116 --> 00:32:43,916 Speaker 2: you realize it's actually wildly hard to make a list 645 00:32:43,956 --> 00:32:47,436 Speaker 2: of rules that can reliably distinguish a cat from a dog. 646 00:32:48,116 --> 00:32:51,956 Speaker 2: And the weird thing that has happened in AI is, like, oh, 647 00:32:51,996 --> 00:32:55,756 Speaker 2: you don't actually have to make a list. You just 648 00:32:55,836 --> 00:32:58,596 Speaker 2: need, like, a giant 649 00:32:58,636 --> 00:33:03,156 Speaker 2: database of images and a giant neural network, and you 650 00:33:03,276 --> 00:33:06,276 Speaker 2: just throw it at it, like, and say, figure it out, 651 00:33:06,316 --> 00:33:09,036 Speaker 2: and it figures it out. And you're sort of doing that, 652 00:33:09,436 --> 00:33:11,396 Speaker 2: but for shaping metal. 653 00:33:11,356 --> 00:33:13,436 Speaker 1: For metal. And then the only challenge was, like, you know, 654 00:33:13,636 --> 00:33:17,636 Speaker 1: cat and dog pictures were on the Internet, and sheet metal forming 655 00:33:17,716 --> 00:33:20,196 Speaker 1: data wasn't.
And so that was an additional problem 656 00:33:20,236 --> 00:33:21,636 Speaker 1: we had to solve, as you pointed out, which is 657 00:33:21,676 --> 00:33:23,196 Speaker 1: a big problem in physical AI. 658 00:33:23,516 --> 00:33:25,116 Speaker 2: So I want to talk a little bit more about 659 00:33:26,236 --> 00:33:31,596 Speaker 2: AI and robotics. Jensen Huang has been talking about it, 660 00:33:31,676 --> 00:33:34,796 Speaker 2: as I'm sure you know, at Nvidia, and Nvidia's VC 661 00:33:35,116 --> 00:33:38,196 Speaker 2: arm is an investor in your company; other people are 662 00:33:38,236 --> 00:33:40,356 Speaker 2: working on what you're working on. I mean, I'm curious, 663 00:33:41,076 --> 00:33:43,356 Speaker 2: what does the sort of AI and robotics path look 664 00:33:43,476 --> 00:33:45,396 Speaker 2: like to you for the next few years? And, like, 665 00:33:45,916 --> 00:33:48,116 Speaker 2: what do you understand about it now that you didn't 666 00:33:48,196 --> 00:33:51,036 Speaker 2: understand, whatever, five years ago? Like, what have you 667 00:33:51,116 --> 00:33:53,916 Speaker 2: really come to realize by working on it all the time? 668 00:33:54,516 --> 00:33:57,396 Speaker 1: I think the biggest problem for physical AI is data 669 00:33:57,436 --> 00:34:01,436 Speaker 1: generation, for now, to train models. So 670 00:34:01,596 --> 00:34:04,076 Speaker 1: there's two things that need to happen. Either new types 671 00:34:04,116 --> 00:34:07,436 Speaker 1: of models need to be created, new architectures, new 672 00:34:07,476 --> 00:34:10,996 Speaker 1: algorithms basically, which I'm sure is going to happen, that 673 00:34:11,116 --> 00:34:15,476 Speaker 1: can learn more with less data, basically the same 674 00:34:15,516 --> 00:34:17,636 Speaker 1: way humans kind of learn more with less data. Right.
675 00:34:18,156 --> 00:34:20,916 Speaker 1: But at the same time, I think, you know, we've 676 00:34:21,156 --> 00:34:25,356 Speaker 1: only exposed our models to, categorically, ten percent of 677 00:34:25,516 --> 00:34:28,756 Speaker 1: the types of data that humans receive. You know, you think 678 00:34:28,796 --> 00:34:32,476 Speaker 1: about, you know, human interactions. You and I are now talking. 679 00:34:33,196 --> 00:34:35,356 Speaker 1: If it was AI, the AI is probably only listening to 680 00:34:35,396 --> 00:34:38,716 Speaker 1: the words we're saying, right? But that's only ten percent 681 00:34:38,716 --> 00:34:41,076 Speaker 1: of communication. I can see your lips moving, I can 682 00:34:41,116 --> 00:34:43,596 Speaker 1: see your eyebrows moving. I can see, like, maybe you're 683 00:34:43,596 --> 00:34:45,876 Speaker 1: folding your arms, and okay, I know that, like, okay, 684 00:34:45,876 --> 00:34:48,596 Speaker 1: maybe... All these, ninety percent of the signals, are 685 00:34:48,596 --> 00:34:52,116 Speaker 1: not captured, that are used for learning. You know, you 686 00:34:52,156 --> 00:34:56,316 Speaker 1: look at, if you ask ChatGPT or DALL-E or, 687 00:34:56,596 --> 00:34:58,636 Speaker 1: you know, any of them, even, you 688 00:34:58,676 --> 00:35:02,636 Speaker 1: know, Grok: say, okay, draw me a clock that 689 00:35:02,796 --> 00:35:06,196 Speaker 1: shows five thirty. 690 00:35:06,236 --> 00:35:07,836 Speaker 1: It will draw you a clock, but it doesn't show 691 00:35:07,876 --> 00:35:10,516 Speaker 1: five thirty. Actually, most of the time it shows ten ten. 692 00:35:10,876 --> 00:35:14,436 Speaker 2: Ten ten, because that's where watch hands, like, analog watch hands, look good. 693 00:35:14,516 --> 00:35:17,196 Speaker 1: Right, it's a nice little V, because those are all 694 00:35:17,236 --> 00:35:19,516 Speaker 1: the images they've seen on the Internet, because in watch ads...
695 00:35:19,516 --> 00:35:22,156 Speaker 2: It's almost always ten ten. It's the classic watch photo. 696 00:35:22,556 --> 00:35:24,636 Speaker 1: It's like, five thirty is also ten ten, because... 697 00:35:24,516 --> 00:35:28,956 Speaker 2: It's always ten ten, right? To a generative AI, it's 698 00:35:28,996 --> 00:35:30,396 Speaker 2: always ten ten somewhere. 699 00:35:30,716 --> 00:35:33,316 Speaker 1: But humans, you know, receive this 700 00:35:33,476 --> 00:35:35,516 Speaker 1: data of movement. When you grow up, you look at 701 00:35:35,516 --> 00:35:37,796 Speaker 1: the clock on the wall as a kid, and you're like, okay, 702 00:35:37,836 --> 00:35:39,636 Speaker 1: now I intuitively get it. I think I know what's 703 00:35:39,676 --> 00:35:41,596 Speaker 1: going on, so I can actually make it work. So 704 00:35:42,476 --> 00:35:44,116 Speaker 1: even though we train it on a lot of data, I 705 00:35:44,156 --> 00:35:46,796 Speaker 1: don't think we've trained it on categorically the right 706 00:35:46,876 --> 00:35:50,836 Speaker 1: data yet to get all the intuitive understanding that 707 00:35:50,916 --> 00:35:53,756 Speaker 1: we have today. So I think we have a data 708 00:35:53,756 --> 00:35:56,356 Speaker 1: problem, and that exists in physical AI. 709 00:35:56,396 --> 00:35:58,516 Speaker 1: There's a lot of people 710 00:35:58,516 --> 00:36:01,076 Speaker 1: working on this. I think the applications that will win are the ones who 711 00:36:01,116 --> 00:36:04,916 Speaker 1: can either synthetically generate that data, or they can actually 712 00:36:05,236 --> 00:36:07,916 Speaker 1: scale in the physical world in a way where they 713 00:36:07,916 --> 00:36:12,076 Speaker 1: can actually generate the data for themselves.
But the scaling 714 00:36:12,116 --> 00:36:14,836 Speaker 1: needs to happen with less data, and I think that's 715 00:36:14,916 --> 00:36:17,196 Speaker 1: why I'm, for example, very bullish 716 00:36:17,196 --> 00:36:19,876 Speaker 1: on manufacturing. So I think the data is going to 717 00:36:19,876 --> 00:36:21,636 Speaker 1: be the biggest challenge. And I think, you know, in 718 00:36:21,716 --> 00:36:25,356 Speaker 1: order for us to massively change this space, we need 719 00:36:25,396 --> 00:36:26,876 Speaker 1: to be able to get to the data. I don't 720 00:36:26,916 --> 00:36:30,756 Speaker 1: think algorithms are the bottleneck there yet. It's just the 721 00:36:30,836 --> 00:36:31,476 Speaker 1: data for us. 722 00:36:31,516 --> 00:36:36,316 Speaker 2: And is it just a matter of people doing what 723 00:36:36,356 --> 00:36:39,636 Speaker 2: you're doing, and like finding little wedge places to start, 724 00:36:39,676 --> 00:36:41,676 Speaker 2: and having people sort of hold the hand of the 725 00:36:41,716 --> 00:36:46,276 Speaker 2: model and train up the models? I mean, that seems 726 00:36:46,316 --> 00:36:48,876 Speaker 2: slow on a certain level. Not, you know, obviously 727 00:36:48,956 --> 00:36:52,236 Speaker 2: it's working for you, but like, is there some kind 728 00:36:52,236 --> 00:36:55,316 Speaker 2: of breakthrough move people can make? Can you put sensors 729 00:36:55,356 --> 00:36:58,396 Speaker 2: somewhere in the world to, you know, train AI without 730 00:36:58,476 --> 00:37:01,516 Speaker 2: having to, you know, have a human stand next to 731 00:37:01,596 --> 00:37:04,276 Speaker 2: it as it messes up one piece of sheet metal 732 00:37:04,316 --> 00:37:04,836 Speaker 2: after another? 733 00:37:05,076 --> 00:37:09,356 Speaker 1: Yeah, I think that there's another path, 734 00:37:09,396 --> 00:37:15,156 Speaker 1: which is the simulation path.
Make physics-based simulations faster and 735 00:37:15,356 --> 00:37:17,316 Speaker 1: kind of learn. Let the robots just go play in 736 00:37:17,356 --> 00:37:19,956 Speaker 1: a digital playground as opposed to deploying them in the real world, 737 00:37:20,236 --> 00:37:22,276 Speaker 1: and that becomes a computation problem. And then, you know, 738 00:37:22,276 --> 00:37:23,956 Speaker 1: as long as you have enough computation, you can train 739 00:37:24,036 --> 00:37:26,596 Speaker 1: the robots. But I think, you 740 00:37:26,636 --> 00:37:28,756 Speaker 1: know, a good example where we have had such success 741 00:37:28,796 --> 00:37:31,796 Speaker 1: so far is autonomous cars, right? They did the same 742 00:37:31,836 --> 00:37:34,196 Speaker 1: thing we are doing, but in the car. Like, okay, Tesla, 743 00:37:35,276 --> 00:37:37,676 Speaker 1: you know, deployed a fleet of robots that are capturing 744 00:37:37,756 --> 00:37:40,716 Speaker 1: data while still being driven by humans, but the data can 745 00:37:40,716 --> 00:37:42,396 Speaker 1: be used later on to kind of automate it. 746 00:37:42,916 --> 00:37:46,356 Speaker 2: I mean, that's an interesting case, because it has been 747 00:37:46,556 --> 00:37:51,636 Speaker 2: much harder, clearly, than many people thought. Maybe most people thought, right? Like, 748 00:37:51,916 --> 00:37:54,476 Speaker 2: I know that's a particular instance where you're really worried 749 00:37:54,476 --> 00:37:59,116 Speaker 2: about edge cases. I don't know, are autonomous cars like 750 00:37:59,276 --> 00:38:01,196 Speaker 2: a good model or not? It seems complicated. 751 00:38:01,196 --> 00:38:03,596 Speaker 1: I think the model of capturing data is there, but 752 00:38:03,636 --> 00:38:06,716 Speaker 1: then the task at hand is very hard.
Yeah, right, 753 00:38:07,236 --> 00:38:10,436 Speaker 1: so I think that's the challenge, right? 754 00:38:10,436 --> 00:38:13,476 Speaker 1: Whereas with us, it's still a much more structured environment. 755 00:38:13,556 --> 00:38:15,236 Speaker 1: And I think that was our thinking. 756 00:38:15,236 --> 00:38:17,756 Speaker 1: I think the hardest problem right now in physical AI 757 00:38:17,956 --> 00:38:20,676 Speaker 1: is finding the business model of how do you scale 758 00:38:20,756 --> 00:38:24,316 Speaker 1: data capture without requiring billions of dollars in investment. 759 00:38:24,556 --> 00:38:25,916 Speaker 2: So what are you making in there today? 760 00:38:26,236 --> 00:38:29,676 Speaker 1: You know, last time I checked in 761 00:38:29,716 --> 00:38:32,916 Speaker 1: the facility, four of the cells were working on 762 00:38:32,956 --> 00:38:33,996 Speaker 1: a defense application. 763 00:38:35,436 --> 00:38:37,316 Speaker 2: Is it secret? Can you tell me what it is? 764 00:38:38,076 --> 00:38:41,876 Speaker 1: It's a missile. And two of them were working on 765 00:38:41,876 --> 00:38:45,076 Speaker 1: an aerospace application; these are components of an aircraft or 766 00:38:45,076 --> 00:38:48,716 Speaker 1: a drone. And one of them, an interesting one, 767 00:38:48,796 --> 00:38:51,756 Speaker 1: was working on an architectural component, which is a roof 768 00:38:51,836 --> 00:38:56,756 Speaker 1: tile for a specific building that's used by the 769 00:38:56,796 --> 00:38:58,716 Speaker 1: Bureau of Reclamation. 770 00:38:58,916 --> 00:39:00,436 Speaker 2: Oh, I was going to say, what is it? 771 00:39:00,516 --> 00:39:04,676 Speaker 2: Something like Frank Gehry, like nightmare weirdo metal park? 772 00:39:05,516 --> 00:39:07,396 Speaker 1: Oh, we have had those in the past too, 773 00:39:07,476 --> 00:39:10,756 Speaker 1: but this one is actually very practical.
Well, this building 774 00:39:10,756 --> 00:39:14,036 Speaker 1: is actually very interesting. These large industrial 775 00:39:14,076 --> 00:39:16,996 Speaker 1: buildings were built in the sixties or fifties, 776 00:39:17,636 --> 00:39:19,716 Speaker 1: and they use these types of roof tiles where the 777 00:39:19,796 --> 00:39:23,596 Speaker 1: manufacturer doesn't exist anymore. And anybody else they went 778 00:39:23,636 --> 00:39:26,916 Speaker 1: to quoted them hundreds of thousands of dollars to make those tiles, 779 00:39:26,916 --> 00:39:28,276 Speaker 1: and we're like, oh no, we can make it for you. 780 00:39:29,476 --> 00:39:31,156 Speaker 1: But that also shows kind of the diversity. I mean, 781 00:39:31,156 --> 00:39:32,476 Speaker 1: like I say, in the morning we have 782 00:39:32,476 --> 00:39:37,716 Speaker 1: aerospace parts; in the afternoon, roof tiles for an industrial complex, 783 00:39:37,796 --> 00:39:39,636 Speaker 1: for, you know, a dam. 784 00:39:40,076 --> 00:39:43,596 Speaker 2: Now you're in the sheet metal business. I know your 785 00:39:43,676 --> 00:39:47,676 Speaker 2: dream is much larger than that, right? But 786 00:39:48,596 --> 00:39:51,996 Speaker 2: tell me where you are now. 787 00:39:51,996 --> 00:39:53,436 Speaker 2: Like, what are you doing? What 788 00:39:53,436 --> 00:39:55,996 Speaker 2: are you selling? And then kind of what's the next 789 00:39:56,396 --> 00:39:56,996 Speaker 2: big step? 790 00:39:57,396 --> 00:39:59,396 Speaker 1: So some of our systems are now operating out in 791 00:39:59,396 --> 00:40:03,996 Speaker 1: the wild and working for the customers.
But I 792 00:40:03,996 --> 00:40:07,556 Speaker 1: think the next phase of growth for us is getting 793 00:40:07,636 --> 00:40:09,996 Speaker 1: into each of these applications and owning more of the 794 00:40:10,076 --> 00:40:13,276 Speaker 1: process, so we can teach the robocraftsmen future processes, 795 00:40:13,356 --> 00:40:15,716 Speaker 1: not just sheet forming, but also maybe how to 796 00:40:15,716 --> 00:40:17,756 Speaker 1: assemble it, how to weld it, how to surface 797 00:40:17,756 --> 00:40:21,076 Speaker 1: finish it, right? So what we are doing now, in 798 00:40:21,116 --> 00:40:24,716 Speaker 1: the next phase, is actually, instead of selling parts or 799 00:40:24,716 --> 00:40:29,156 Speaker 1: components or systems, we're actually saying, okay, can we get 800 00:40:29,196 --> 00:40:33,796 Speaker 1: this robocraftsman to actually build you a subassembly or a 801 00:40:33,836 --> 00:40:36,756 Speaker 1: full product, not just a component of it, but a 802 00:40:36,796 --> 00:40:39,596 Speaker 1: full product. So that's something we're discussing with folks. Can 803 00:40:39,596 --> 00:40:43,676 Speaker 1: we have the robocraftsmen build the full drone for you? 804 00:40:43,716 --> 00:40:45,636 Speaker 1: Can we have the robocraftsman build you a full 805 00:40:45,636 --> 00:40:49,556 Speaker 1: missile, as opposed to just building, you know, missile skins? 806 00:40:49,876 --> 00:40:53,316 Speaker 2: That seems like a leap. Is there not 807 00:40:53,396 --> 00:40:54,756 Speaker 2: an intermediate step? 808 00:40:55,276 --> 00:40:58,596 Speaker 1: Yes, yes. So how we're doing it 809 00:40:58,676 --> 00:41:01,156 Speaker 1: is we're gradually stepping into it, right, the same way 810 00:41:01,196 --> 00:41:03,596 Speaker 1: sheet metal was our first application.
So we're putting up a 811 00:41:03,636 --> 00:41:08,196 Speaker 1: facility that maybe makes drones, but the main component that 812 00:41:08,236 --> 00:41:11,916 Speaker 1: we automate today is sheet forming, which is the bottleneck. 813 00:41:12,436 --> 00:41:14,756 Speaker 1: And then we do the welding in a traditional way 814 00:41:14,916 --> 00:41:17,156 Speaker 1: on the same robots, but we actually instruct them to 815 00:41:17,196 --> 00:41:19,716 Speaker 1: do it. 816 00:41:18,356 --> 00:41:21,716 Speaker 2: So that way, the robot is kind of back where 817 00:41:21,716 --> 00:41:24,196 Speaker 2: it was on sheet metal five years ago, but it's 818 00:41:24,276 --> 00:41:26,276 Speaker 2: learning how to weld now? 819 00:41:26,716 --> 00:41:29,076 Speaker 1: Exactly. I used to work in, you know, a shop 820 00:41:29,116 --> 00:41:31,636 Speaker 1: where we would do custom cars, build custom cars 821 00:41:31,636 --> 00:41:33,836 Speaker 1: by hand, and so it was also near and 822 00:41:33,836 --> 00:41:37,556 Speaker 1: dear to my heart. So what we realized is that 823 00:41:37,556 --> 00:41:41,556 Speaker 1: with our technology, for the first time, we can actually 824 00:41:41,676 --> 00:41:45,156 Speaker 1: enable a product that didn't exist in automotive, meaning that 825 00:41:45,676 --> 00:41:48,516 Speaker 1: instead of buying a car that's mass-produced, where every 826 00:41:48,516 --> 00:41:50,796 Speaker 1: single one of them looks the same, you can now 827 00:41:50,956 --> 00:41:54,236 Speaker 1: let the customer design a custom car for themselves. You know, 828 00:41:54,316 --> 00:41:55,716 Speaker 1: right now, if you go buy a car, 829 00:41:55,796 --> 00:41:58,396 Speaker 1: you have options for what the seat color 830 00:41:58,396 --> 00:42:01,276 Speaker 1: would be, or maybe the color of the car, 831 00:42:01,316 --> 00:42:04,116 Speaker 1: and some trim options.
But you can't really choose 832 00:42:04,156 --> 00:42:06,396 Speaker 1: the design of your car. You can't say, oh, I 833 00:42:06,396 --> 00:42:08,516 Speaker 1: want a different hood and I want a different fender, 834 00:42:09,236 --> 00:42:11,716 Speaker 1: because it goes back to the same problem: you have to 835 00:42:11,716 --> 00:42:15,756 Speaker 1: make tooling and molds for a fender of a certain design. 836 00:42:15,756 --> 00:42:18,356 Speaker 1: You cannot easily change it. So with our technology you can. 837 00:42:18,556 --> 00:42:22,796 Speaker 1: So what we started doing was, like, okay, applying this 838 00:42:23,116 --> 00:42:26,636 Speaker 1: freedom that this technology provides to automotive: the 839 00:42:26,796 --> 00:42:28,756 Speaker 1: ability of the customers to be able to go to 840 00:42:28,796 --> 00:42:33,116 Speaker 1: a website and design a fully customized car for themselves. It 841 00:42:33,196 --> 00:42:36,076 Speaker 1: can be either from already-designed panels from a car designer, 842 00:42:36,476 --> 00:42:40,236 Speaker 1: or adding specific customizations they want to do, 843 00:42:40,316 --> 00:42:42,396 Speaker 1: for example, the logo of their company on the door of 844 00:42:42,436 --> 00:42:44,636 Speaker 1: the car or the hood of the car, and actually 845 00:42:44,636 --> 00:42:49,116 Speaker 1: get a completely unique car manufactured for them. And 846 00:42:49,156 --> 00:42:52,036 Speaker 1: we're actually working on this with some of our automotive 847 00:42:52,036 --> 00:42:55,516 Speaker 1: partners, automotive OEMs, as well. Right, we actually showed some 848 00:42:55,556 --> 00:42:58,556 Speaker 1: of this work at the biggest aftermarket show in the 849 00:42:58,636 --> 00:43:02,116 Speaker 1: United States, which is called SEMA, with our partner Toyota.
So 850 00:43:02,596 --> 00:43:05,156 Speaker 1: I think this is going to be, in my opinion, 851 00:43:05,476 --> 00:43:08,356 Speaker 1: one of the new product categories in automotive. We have 852 00:43:08,476 --> 00:43:10,996 Speaker 1: had autonomous cars, we have had, you know, 853 00:43:11,156 --> 00:43:13,596 Speaker 1: electric cars, and I think now, for the first time, 854 00:43:13,636 --> 00:43:17,116 Speaker 1: with technologies like ours, you can have custom-to-order cars. 855 00:43:17,276 --> 00:43:19,636 Speaker 1: The same way 856 00:43:19,676 --> 00:43:21,316 Speaker 1: you choose what T-shirt you wear and your 857 00:43:21,436 --> 00:43:23,556 Speaker 1: T-shirt is different than mine, we also don't have to 858 00:43:23,636 --> 00:43:26,356 Speaker 1: drive the same, you know, Model S or 859 00:43:26,476 --> 00:43:29,076 Speaker 1: Model 3. We can actually have our own customized Model 860 00:43:29,076 --> 00:43:30,036 Speaker 1: 3s and Model Ss. 861 00:43:30,676 --> 00:43:33,276 Speaker 2: So, I mean, if you 862 00:43:33,316 --> 00:43:36,636 Speaker 2: think sort of long term for Machina, is that 863 00:43:36,676 --> 00:43:38,676 Speaker 2: what you think about? Like, give me 864 00:43:38,716 --> 00:43:43,196 Speaker 2: the five-year vision, or ten-year, or whatever. 865 00:43:43,556 --> 00:43:46,876 Speaker 1: Yeah. So I think the long-term motivation behind our 866 00:43:46,876 --> 00:43:51,996 Speaker 1: company is, can you grant this democratization of ideas for 867 00:43:52,036 --> 00:43:54,116 Speaker 1: people who want to build anything? Right? Can I express 868 00:43:54,156 --> 00:43:56,956 Speaker 1: myself? If I'm a builder, can I go build something 869 00:43:56,956 --> 00:43:58,916 Speaker 1: without having to build a factory for it? So that's 870 00:43:58,996 --> 00:44:02,836 Speaker 1: really the long-term goal.
So I imagine, in the 871 00:44:02,876 --> 00:44:07,356 Speaker 1: next five to ten years, you can, as a designer, 872 00:44:07,396 --> 00:44:09,596 Speaker 1: somebody who has an idea, go to a website, 873 00:44:09,916 --> 00:44:13,996 Speaker 1: get guided through your ideas on how to make and 874 00:44:14,076 --> 00:44:17,156 Speaker 1: design a physical product, hit a button and say, okay, 875 00:44:17,196 --> 00:44:22,676 Speaker 1: I want twenty of these, and I want them in Chatsworth, California, 876 00:44:22,716 --> 00:44:25,596 Speaker 1: and the right facility programs the right number of robots 877 00:44:25,876 --> 00:44:28,916 Speaker 1: to actually do those operations, without any hardware investment 878 00:44:29,236 --> 00:44:31,156 Speaker 1: that needs to be made for those specific parts, 879 00:44:31,596 --> 00:44:33,316 Speaker 1: and ships them to you two days later in the 880 00:44:33,396 --> 00:44:37,716 Speaker 1: right location. That is the future we're building towards. Cars 881 00:44:37,796 --> 00:44:41,076 Speaker 1: are just, you know, one of the products that could 882 00:44:41,076 --> 00:44:43,116 Speaker 1: be built. But I imagine that, you know, this technology, 883 00:44:43,156 --> 00:44:45,436 Speaker 1: or technologies like these, can be used 884 00:44:45,476 --> 00:44:48,276 Speaker 1: to make a myriad of designs. I think the moment 885 00:44:48,476 --> 00:44:52,276 Speaker 1: you open up this possibility that any design could be 886 00:44:52,316 --> 00:44:54,556 Speaker 1: a reality, I think so many things will be created 887 00:44:54,596 --> 00:44:57,316 Speaker 1: that we're not even thinking of right now. You know, 888 00:44:57,396 --> 00:45:00,356 Speaker 1: the fact that we have cars today and they all 889 00:45:00,396 --> 00:45:03,316 Speaker 1: look the same is a limitation of technology.
But the moment 890 00:45:03,356 --> 00:45:07,156 Speaker 1: you can open up this creativity of turning ideas into 891 00:45:07,156 --> 00:45:11,116 Speaker 1: physical reality, without an initial investment or huge 892 00:45:11,156 --> 00:45:13,156 Speaker 1: barrier to entry, then I think we're going to have 893 00:45:13,196 --> 00:45:15,836 Speaker 1: all kinds of drones, all kinds of satellites, all kinds 894 00:45:15,876 --> 00:45:18,356 Speaker 1: of rockets, all kinds of cars. There's going to 895 00:45:18,356 --> 00:45:22,076 Speaker 1: be this, like, you know, Cambrian explosion of different designs 896 00:45:22,116 --> 00:45:23,636 Speaker 1: that's going to come into our world. And I think 897 00:45:23,636 --> 00:45:25,676 Speaker 1: that's what the future is about. You 898 00:45:25,676 --> 00:45:27,196 Speaker 1: know what I call it: the future is custom. The 899 00:45:27,276 --> 00:45:29,956 Speaker 1: future is about being able to make all these 900 00:45:29,996 --> 00:45:33,716 Speaker 1: ideas a reality. We had this explosion happening in the digital world. Yeah, 901 00:45:33,756 --> 00:45:36,636 Speaker 1: you know, now we even have models generating images and 902 00:45:37,156 --> 00:45:41,116 Speaker 1: videos, and there's this, you know, explosion of different ideas 903 00:45:41,116 --> 00:45:45,276 Speaker 1: and content being created using the technology. But the link 904 00:45:45,356 --> 00:45:47,636 Speaker 1: to the physical world is broken, and the physical world 905 00:45:47,676 --> 00:45:49,596 Speaker 1: is still pretty uniform, because it's very hard to make 906 00:45:49,636 --> 00:45:52,756 Speaker 1: things in the physical world. Can we bridge that gap?
907 00:45:52,916 --> 00:45:56,236 Speaker 1: Can we connect the digital world of creation to the physical 908 00:45:56,316 --> 00:45:59,396 Speaker 1: world of creation and create the same variety in the 909 00:45:59,396 --> 00:46:01,996 Speaker 1: physical world as we have in the digital world? I 910 00:46:02,036 --> 00:46:03,876 Speaker 1: think that's the goal of our company. 911 00:46:06,956 --> 00:46:08,916 Speaker 2: We'll be back in a minute with the Lightning Round. 912 00:46:17,156 --> 00:46:22,516 Speaker 2: Let's finish with the Lightning Round. Do you drive a 913 00:46:22,556 --> 00:46:23,996 Speaker 2: customized car? 914 00:46:24,436 --> 00:46:28,036 Speaker 1: I don't, actually, yet. 915 00:46:27,996 --> 00:46:30,436 Speaker 2: Well then, what have I seen on your Instagram? What's that truck you keep posting 916 00:46:30,476 --> 00:46:31,316 Speaker 2: on your Instagram? 917 00:46:31,356 --> 00:46:33,396 Speaker 1: So I have a truck that's customized. I 918 00:46:33,396 --> 00:46:36,636 Speaker 1: don't drive it around as much, but maybe this year 919 00:46:36,676 --> 00:46:38,516 Speaker 1: I'll start taking it out. You know, 920 00:46:38,516 --> 00:46:41,556 Speaker 1: we have been kind of stealth about it, 921 00:46:41,716 --> 00:46:43,316 Speaker 1: we haven't talked about it in a big way, 922 00:46:43,356 --> 00:46:45,076 Speaker 1: because we have a big release coming soon. 923 00:46:45,556 --> 00:46:48,036 Speaker 2: I mean, you're literally posting it on Instagram. It's not 924 00:46:48,116 --> 00:46:51,116 Speaker 2: that stealth. Tell me about that truck 925 00:46:51,156 --> 00:46:53,196 Speaker 2: you keep posting on Instagram. What's going on with that? 926 00:46:53,316 --> 00:46:55,116 Speaker 1: So the truck, the full body, is 927 00:46:55,116 --> 00:46:55,996 Speaker 1: fully customized.
928 00:46:56,396 --> 00:46:58,836 Speaker 2: It says Anvil on the back when you post it. 929 00:46:58,876 --> 00:46:59,756 Speaker 2: Is it called Anvil? Dumb question. 930 00:46:59,836 --> 00:47:01,916 Speaker 1: We do call it Anvil. I 931 00:47:01,956 --> 00:47:04,476 Speaker 1: think the idea was actually that the shape design of 932 00:47:04,516 --> 00:47:06,236 Speaker 1: it was inspired by an anvil. If you look at the 933 00:47:06,236 --> 00:47:09,516 Speaker 1: front, the front bumper 934 00:47:09,556 --> 00:47:11,876 Speaker 1: looks like an anvil. But also the idea is that, like, 935 00:47:11,916 --> 00:47:14,756 Speaker 1: you know, we're actually forming sheets on an anvil. So yeah, 936 00:47:14,796 --> 00:47:15,676 Speaker 1: it was very fitting. 937 00:47:15,916 --> 00:47:18,036 Speaker 2: Tell me about that truck. Just tell me, 938 00:47:18,076 --> 00:47:18,756 Speaker 2: what's it look like? 939 00:47:18,876 --> 00:47:20,716 Speaker 1: Yeah, so for example, you know, we put a 940 00:47:20,716 --> 00:47:23,276 Speaker 1: lot of form and sharp edges in the hood, right? 941 00:47:24,276 --> 00:47:27,156 Speaker 1: Most vehicles have a very hard time with that; if you look 942 00:47:27,196 --> 00:47:29,756 Speaker 1: at the hoods of most vehicles, 943 00:47:30,116 --> 00:47:32,636 Speaker 1: they are very smooth, because it's very hard to actually 944 00:47:32,636 --> 00:47:35,436 Speaker 1: put sharp angles in the hood. So if you look 945 00:47:35,476 --> 00:47:37,436 Speaker 1: at this truck, it has a lot of angles, 946 00:47:37,436 --> 00:47:40,636 Speaker 1: a lot of sharp detail right in the hood.
Right. 947 00:47:41,156 --> 00:47:43,556 Speaker 1: And that's very expressive of the type of 948 00:47:43,636 --> 00:47:46,756 Speaker 1: person, for example, that I am, right? I like things 949 00:47:46,756 --> 00:47:50,116 Speaker 1: that are edgy, and that truck is certainly edgy, right? 950 00:47:51,836 --> 00:47:55,236 Speaker 1: It's bare metal, right? You know, there are no blemishes 951 00:47:55,276 --> 00:47:59,596 Speaker 1: being hidden under the vehicle. You know, 952 00:47:59,636 --> 00:48:02,036 Speaker 1: when the Cybertruck came out, a lot of people 953 00:48:02,236 --> 00:48:05,316 Speaker 1: got very excited about, you know, oh, it's bare metal. 954 00:48:05,356 --> 00:48:06,596 Speaker 1: It looks like metal. But then there was no 955 00:48:06,676 --> 00:48:08,756 Speaker 1: form in it, because it's actually very hard to make 956 00:48:08,756 --> 00:48:12,196 Speaker 1: formed metal look nice. And so that's one of 957 00:48:12,196 --> 00:48:14,356 Speaker 1: the things we wanted to show. We wanted to show that, okay, 958 00:48:14,396 --> 00:48:16,676 Speaker 1: you can actually have formed metal with a lot 959 00:48:16,676 --> 00:48:18,756 Speaker 1: of detail in it and still keep it bare metal, 960 00:48:18,796 --> 00:48:22,236 Speaker 1: because it will look nice. Right? So, yeah, a lot 961 00:48:22,276 --> 00:48:24,756 Speaker 1: of the design features of it, for me, kind of represent 962 00:48:24,756 --> 00:48:27,436 Speaker 1: the type of personality and character that I have. But 963 00:48:27,516 --> 00:48:29,716 Speaker 1: I think that's how every car should be. You know, 964 00:48:29,716 --> 00:48:31,556 Speaker 1: people should be able to have that freedom to choose 965 00:48:31,596 --> 00:48:32,676 Speaker 1: what their cars look like. 966 00:48:34,876 --> 00:48:36,516 Speaker 2: How many skull tattoos do you have? 967 00:48:38,316 --> 00:48:46,036 Speaker 1: I've got three.
Why? So, yeah, it's an 968 00:48:46,036 --> 00:48:50,556 Speaker 1: interesting thing. A skull, for me, is kind of 969 00:48:50,596 --> 00:48:54,276 Speaker 1: an abstraction for the death of ego. So I 970 00:48:54,276 --> 00:48:57,076 Speaker 1: have a tattoo on my thumb, which is a skull 971 00:48:57,196 --> 00:49:01,156 Speaker 1: that's holding a microphone to his ears. And this was 972 00:49:01,196 --> 00:49:03,556 Speaker 1: a time where, you know, I felt like 973 00:49:03,636 --> 00:49:05,276 Speaker 1: I had a good platform and I could talk a 974 00:49:05,276 --> 00:49:07,996 Speaker 1: lot and people would listen. But then I realized I 975 00:49:08,036 --> 00:49:10,436 Speaker 1: should maybe 976 00:49:10,716 --> 00:49:13,436 Speaker 1: keep the mic close to my ears and also listen, 977 00:49:13,476 --> 00:49:17,676 Speaker 1: as opposed to talking all the time, right? So I think... 978 00:49:17,956 --> 00:49:21,116 Speaker 2: Skull microphones don't work that way, for the record, but... 979 00:49:21,116 --> 00:49:24,876 Speaker 1: I like it as a metaphor, exactly. But I think 980 00:49:24,876 --> 00:49:27,716 Speaker 1: the idea is around, you know, kind of reminders. 981 00:49:27,716 --> 00:49:29,116 Speaker 1: You can see a lot of my tattoos on 982 00:49:29,156 --> 00:49:31,676 Speaker 1: my hands, so it's really a reminder for 983 00:49:31,756 --> 00:49:36,196 Speaker 1: myself to, you know, be present and make 984 00:49:36,236 --> 00:49:38,676 Speaker 1: sure that, you know, you're not involved with your ego 985 00:49:38,756 --> 00:49:41,196 Speaker 1: too much and you can see other people's perspectives. 986 00:49:41,836 --> 00:49:45,476 Speaker 2: Is there any tension between ego death and custom cars? 987 00:49:46,996 --> 00:49:50,316 Speaker 1: Tension between ego death and custom cars? I don't know.
988 00:49:50,436 --> 00:49:52,996 Speaker 2: I'm just playing, but like, you know, a custom car kind 989 00:49:52,996 --> 00:49:55,636 Speaker 2: of seems like, hey, look at me, I'm special, and 990 00:49:55,716 --> 00:49:58,476 Speaker 2: ego death seems like, oh, don't look at me, I'm 991 00:49:58,476 --> 00:49:59,116 Speaker 2: not so special. 992 00:49:59,316 --> 00:50:01,996 Speaker 1: Yeah, no, I think the difference is, yeah, 993 00:50:01,996 --> 00:50:04,636 Speaker 1: if you have attachment to your custom car, then maybe 994 00:50:04,996 --> 00:50:07,556 Speaker 1: there's tension. But I think of it more in terms 995 00:50:07,556 --> 00:50:10,476 Speaker 1: of expression, right? You know, you can be an artist. 996 00:50:10,636 --> 00:50:12,996 Speaker 1: You can design your home the 997 00:50:13,076 --> 00:50:16,356 Speaker 1: way it expresses you. You can design the theme of 998 00:50:16,396 --> 00:50:18,756 Speaker 1: your podcast the way it expresses you. You can design 999 00:50:18,836 --> 00:50:21,596 Speaker 1: your car, also, the way it expresses you. I think 1000 00:50:21,636 --> 00:50:23,676 Speaker 1: it's less about, oh, look at me, I'm special. 1001 00:50:23,676 --> 00:50:26,636 Speaker 1: It's more like, here's my expression to the world for 1002 00:50:26,676 --> 00:50:30,516 Speaker 1: people to see. But I think that expressiveness 1003 00:50:30,596 --> 00:50:32,716 Speaker 1: is pretty amazing. I think that's one of 1004 00:50:32,756 --> 00:50:34,876 Speaker 1: the unique things about humans, that, like, you know, 1005 00:50:34,876 --> 00:50:37,196 Speaker 1: I think all we do when we come to 1006 00:50:37,196 --> 00:50:40,596 Speaker 1: this world is express ourselves, right? Expressing ourselves through our work, 1007 00:50:40,716 --> 00:50:44,956 Speaker 1: expressing ourselves through our relationships.
And if you can 1008 00:50:45,076 --> 00:50:48,436 Speaker 1: enable people to express themselves better, I think that's great. 1009 00:50:49,156 --> 00:50:51,596 Speaker 1: But if you get attached to your expressions and your 1010 00:50:51,636 --> 00:50:54,036 Speaker 1: ideas and your thoughts and think, oh, I'm better than 1011 00:50:54,036 --> 00:50:56,956 Speaker 1: everybody else, then I think that becomes 1012 00:50:56,956 --> 00:50:58,516 Speaker 1: a little bit of an ego-driven trip. 1013 00:51:05,996 --> 00:51:09,676 Speaker 2: Edward Mayer is the co-founder and CEO of Machina Labs. 1014 00:51:10,356 --> 00:51:13,636 Speaker 2: Today's show was produced by Gabriel Hunter-Chang. It was 1015 00:51:13,876 --> 00:51:17,356 Speaker 2: edited by Lydia Jean Kott and engineered by Sarah Bruguiere. 1016 00:51:17,836 --> 00:51:20,996 Speaker 2: You can email us at problem at Pushkin dot FM. 1017 00:51:21,556 --> 00:51:23,876 Speaker 2: I'm Jacob Goldstein and we'll be back next week with 1018 00:51:23,956 --> 00:51:32,916 Speaker 2: another episode of What's Your Problem.