1 00:00:15,356 --> 00:00:15,796 Speaker 1: Pushkin. 2 00:00:20,396 --> 00:00:23,996 Speaker 2: When we started this show in twenty twenty two, the 3 00:00:24,076 --> 00:00:27,996 Speaker 2: standard line about driverless cars was driverless cars have been 4 00:00:28,076 --> 00:00:31,796 Speaker 2: five years away for the past fifteen years, because it 5 00:00:31,876 --> 00:00:35,796 Speaker 2: seemed like they were just always around the corner, always 6 00:00:35,916 --> 00:00:38,396 Speaker 2: just a few years away, but they never quite arrived. 7 00:00:39,276 --> 00:00:43,476 Speaker 2: Nobody says that anymore. Today, in several cities around the country, 8 00:00:43,676 --> 00:00:46,236 Speaker 2: getting a ride from a driverless car is just a 9 00:00:46,316 --> 00:00:49,436 Speaker 2: normal thing people do, and it'll become normal in more 10 00:00:49,476 --> 00:00:52,476 Speaker 2: and more and more cities over the next few years. 11 00:00:52,796 --> 00:00:56,036 Speaker 2: Driverless cars are here now, so now we can ask 12 00:00:56,556 --> 00:01:05,356 Speaker 2: what's next. I'm Jacob Goldstein and this is What's Your Problem, 13 00:01:05,436 --> 00:01:07,196 Speaker 2: the show where I talk to people who are trying 14 00:01:07,236 --> 00:01:11,076 Speaker 2: to make technological progress. My guest today is Boris Sofman. 15 00:01:11,516 --> 00:01:15,036 Speaker 2: He's the co-founder and CEO of Bedrock Robotics, a 16 00:01:15,076 --> 00:01:18,676 Speaker 2: company that's figuring out how to retrofit heavy equipment to 17 00:01:18,716 --> 00:01:22,996 Speaker 2: make it work autonomously. Boris's problem is this: how do 18 00:01:23,036 --> 00:01:25,876 Speaker 2: you teach machines not just to drive, but to do 19 00:01:25,956 --> 00:01:30,196 Speaker 2: things like grade roads and move heavy things around construction sites.
20 00:01:31,116 --> 00:01:34,476 Speaker 2: Boris's company is starting with excavators, and they plan to 21 00:01:34,516 --> 00:01:39,196 Speaker 2: have their first commercial excavators autonomously digging holes on construction 22 00:01:39,316 --> 00:01:43,676 Speaker 2: projects next year. Later in the interview, Boris goes big. 23 00:01:43,716 --> 00:01:47,036 Speaker 2: He argues that if Bedrock succeeds, the company could help 24 00:01:47,076 --> 00:01:51,076 Speaker 2: push forward a broad wave of new building in America. 25 00:01:51,796 --> 00:01:55,396 Speaker 2: But first we talked about his time at Waymo and 26 00:01:55,436 --> 00:01:58,236 Speaker 2: how the wild evolution of the autonomous vehicles that he 27 00:01:58,276 --> 00:02:01,156 Speaker 2: worked on there led him to start Bedrock. 28 00:02:01,836 --> 00:02:04,716 Speaker 1: The time at Waymo was this incredible period where I 29 00:02:04,836 --> 00:02:07,836 Speaker 1: was there for about five years, from mid twenty nineteen 30 00:02:07,956 --> 00:02:10,956 Speaker 1: through spring twenty four, and that was this really 31 00:02:10,956 --> 00:02:13,836 Speaker 1: big period where it was going through this like fifteen 32 00:02:13,956 --> 00:02:16,956 Speaker 1: years of R and D, and then it finally transitioned 33 00:02:16,956 --> 00:02:19,876 Speaker 1: into this hockey stick of growth that is happening today. 34 00:02:19,876 --> 00:02:22,316 Speaker 1: And so today Waymo's at over one hundred million miles 35 00:02:22,356 --> 00:02:25,036 Speaker 1: fully driverless. It's five times safer than a human. 36 00:02:25,316 --> 00:02:27,316 Speaker 1: It's millions of miles every single week, and so it's 37 00:02:27,356 --> 00:02:30,436 Speaker 1: kind of scaling exponentially and it's like a genuinely fantastic product.
38 00:02:30,836 --> 00:02:32,676 Speaker 1: But I was there when we were like stressing over 39 00:02:32,716 --> 00:02:35,036 Speaker 1: the first hundred miles, and it felt like the most 40 00:02:35,076 --> 00:02:38,476 Speaker 1: incredible achievement to just go like ten, fifty, a hundred miles 41 00:02:38,556 --> 00:02:40,716 Speaker 1: completely driverless. Say, now that happens at like 42 00:02:40,836 --> 00:02:43,156 Speaker 1: hundreds of thousands of miles every single day. And so 43 00:02:44,436 --> 00:02:46,076 Speaker 1: one of the things that really broke through and made 44 00:02:46,076 --> 00:02:49,276 Speaker 1: that possible is this shift to machine learning and data 45 00:02:49,356 --> 00:02:51,716 Speaker 1: driven approaches as a core of the autonomy stack. 46 00:02:52,116 --> 00:02:54,756 Speaker 2: And just to be clear, like that's as opposed to 47 00:02:54,876 --> 00:02:58,556 Speaker 2: a more like heuristics or rule-based kind of model, 48 00:02:58,676 --> 00:03:01,836 Speaker 2: like the old-school twentieth-century way, like, well, just 49 00:03:01,876 --> 00:03:04,356 Speaker 2: tell the car all the rules of how to drive 50 00:03:04,556 --> 00:03:07,556 Speaker 2: and then it'll drive. Like that's what you're comparing machine 51 00:03:07,596 --> 00:03:07,956 Speaker 2: learning to. 52 00:03:08,396 --> 00:03:10,916 Speaker 1: Yeah, like, to drive, I'm going to like embed the cost 53 00:03:10,956 --> 00:03:13,436 Speaker 1: functions and all this. So yeah, so like large scale 54 00:03:13,436 --> 00:03:16,276 Speaker 1: search and heuristics and rules, and you can embed them 55 00:03:16,276 --> 00:03:18,556 Speaker 1: inside those heuristics, and so you can solve almost 56 00:03:18,556 --> 00:03:20,876 Speaker 1: any given problem with that sort of approach.
57 00:03:21,836 --> 00:03:24,276 Speaker 2: So you can solve one problem, but you can't solve 58 00:03:24,316 --> 00:03:27,436 Speaker 2: like every single possible problem that would ever arise when 59 00:03:27,476 --> 00:03:30,076 Speaker 2: you're driving, which is actually what you have to solve 60 00:03:30,156 --> 00:03:32,036 Speaker 2: to do full autonomy. 61 00:03:31,636 --> 00:03:33,996 Speaker 1: Right, right. It's like playing whack-a-mole, where like you fix one 62 00:03:34,036 --> 00:03:35,876 Speaker 1: problem and it becomes harder and harder to kind of 63 00:03:35,916 --> 00:03:38,276 Speaker 1: solve the other ones. And so there was 64 00:03:38,316 --> 00:03:41,796 Speaker 1: this really conscious shift at Waymo, which was kind of 65 00:03:41,796 --> 00:03:43,676 Speaker 1: bold at the time, because while it feels obvious in hindsight, 66 00:03:43,716 --> 00:03:46,036 Speaker 1: it wasn't obvious at all back then. The shift to 67 00:03:46,156 --> 00:03:49,796 Speaker 1: like really embracing this as a data driven solution where 68 00:03:49,836 --> 00:03:52,916 Speaker 1: you're learning from human driving and human behavior. 69 00:03:52,996 --> 00:03:55,916 Speaker 2: And when you say you're learning, you mean the machine 70 00:03:55,996 --> 00:03:58,676 Speaker 2: learning model. The AI basically is learning. 71 00:03:58,436 --> 00:04:01,036 Speaker 1: The machine learning model, that's right. And so you're basically 72 00:04:01,076 --> 00:04:03,596 Speaker 1: taking giant scales of data, hundreds of thousands to 73 00:04:03,596 --> 00:04:06,676 Speaker 1: millions of miles, and you're learning the model of how 74 00:04:06,676 --> 00:04:09,316 Speaker 1: do you drive and how do you interpret this?
Like 75 00:04:09,676 --> 00:04:12,236 Speaker 1: infinite complexity and kitchen sink of context, like all of 76 00:04:12,236 --> 00:04:14,956 Speaker 1: this sensor data, the road structures, the things you see 77 00:04:14,996 --> 00:04:17,036 Speaker 1: around you, the way people are moving. How do you 78 00:04:17,076 --> 00:04:20,396 Speaker 1: go from a kind of engineered and, you know, kind 79 00:04:20,436 --> 00:04:22,996 Speaker 1: of tuned solution into one that is like dominantly a 80 00:04:23,036 --> 00:04:23,676 Speaker 1: learned solution. 81 00:04:24,116 --> 00:04:26,236 Speaker 2: I mean, as you're doing that, you're sort of riding 82 00:04:26,316 --> 00:04:31,236 Speaker 2: the historic machine learning AI wave, right, Like that's right. 83 00:04:31,316 --> 00:04:34,316 Speaker 2: Presumably you're able to do that at that moment because 84 00:04:34,716 --> 00:04:38,196 Speaker 2: of this explosion in AI, which is basically machine learning, right, 85 00:04:38,236 --> 00:04:38,676 Speaker 2: you know that. 86 00:04:38,876 --> 00:04:40,636 Speaker 1: And you couldn't have done that five, ten years ago. 87 00:04:40,836 --> 00:04:42,876 Speaker 2: That's what you're kind of coming out of at Waymo, 88 00:04:43,036 --> 00:04:45,716 Speaker 2: that's driving this success at Waymo. How does that 89 00:04:46,236 --> 00:04:47,836 Speaker 2: set you up for what you're doing at Bedrock?
90 00:04:48,236 --> 00:04:50,636 Speaker 1: The most shocking thing was just how well it generalized 91 00:04:50,676 --> 00:04:53,196 Speaker 1: from San Francisco to Los Angeles, Phoenix and Austin, and 92 00:04:53,236 --> 00:04:56,516 Speaker 1: then it becomes almost like a qualification problem eventually, where you're 93 00:04:56,596 --> 00:04:58,476 Speaker 1: using data to fill some gaps, but like you need 94 00:04:58,556 --> 00:05:00,236 Speaker 1: less and less of it, less and less new things 95 00:05:00,276 --> 00:05:02,996 Speaker 1: surprise you, like your competency kind of expands. 96 00:05:03,396 --> 00:05:06,196 Speaker 1: And then we unified the technology stack between cars and 97 00:05:06,196 --> 00:05:08,796 Speaker 1: trucks, where even jumping from a car to a truck 98 00:05:09,236 --> 00:05:11,076 Speaker 1: needed maybe ten to fifty percent more data, but it 99 00:05:11,076 --> 00:05:13,876 Speaker 1: was fundamentally like you're using data to explain how 100 00:05:13,916 --> 00:05:15,196 Speaker 1: you operate a very different platform. 101 00:05:15,236 --> 00:05:16,756 Speaker 2: This is like a big truck, This is like an 102 00:05:16,796 --> 00:05:19,116 Speaker 2: eighteen wheel truck. When Google was working on that, yeah, 103 00:05:19,116 --> 00:05:19,996 Speaker 2: or Waymo was, like. 104 00:05:19,996 --> 00:05:23,676 Speaker 1: A fifty-three-foot trailer, like eighty thousand pounds. And so 105 00:05:23,756 --> 00:05:27,276 Speaker 1: that was the big moment where we started thinking about 106 00:05:27,316 --> 00:05:28,996 Speaker 1: where else can you apply this? What are the places 107 00:05:28,996 --> 00:05:32,396 Speaker 1: where you have all this diversity of challenges and capabilities 108 00:05:32,396 --> 00:05:35,396 Speaker 1: that benefit from this type of versatility and also have 109 00:05:35,476 --> 00:05:38,036 Speaker 1: the ability to jump between platforms in this really natural way.
110 00:05:38,116 --> 00:05:41,036 Speaker 1: And so we looked at a lot of spaces and 111 00:05:41,076 --> 00:05:44,476 Speaker 1: really settled on automation of specialized heavy machinery, and so 112 00:05:44,516 --> 00:05:47,796 Speaker 1: the types of machines that you see in construction, like 113 00:05:47,876 --> 00:05:50,516 Speaker 1: excavators and wheel loaders and bulldozers, but also frankly the sort 114 00:05:50,516 --> 00:05:53,396 Speaker 1: of machines you see in all sorts of industries like 115 00:05:53,716 --> 00:05:57,756 Speaker 1: agriculture and mining and lumber and garbage movement. And so 116 00:05:57,836 --> 00:06:00,476 Speaker 1: you have these very diverse types of machines that are 117 00:06:00,516 --> 00:06:03,396 Speaker 1: interacting with the world around them. They're fairly slow moving, 118 00:06:03,436 --> 00:06:05,716 Speaker 1: they're in semi controlled environments, and at the end of 119 00:06:05,756 --> 00:06:09,756 Speaker 1: the day, there's astronomical scale at which they operate. 120 00:06:09,836 --> 00:06:12,676 Speaker 1: And a lot of the learnings that we experienced at 121 00:06:12,756 --> 00:06:15,636 Speaker 1: Waymo actually transfer over incredibly well. But the physics of 122 00:06:15,676 --> 00:06:17,756 Speaker 1: the problem is a lot less adversarial. You can actually, like, 123 00:06:17,836 --> 00:06:19,476 Speaker 1: really tackle this and get to market. 124 00:06:19,916 --> 00:06:26,396 Speaker 2: Basically, big vehicles operating in semi constrained environments doing things 125 00:06:26,436 --> 00:06:28,796 Speaker 2: to the world, interacting with the world in some physical 126 00:06:28,836 --> 00:06:30,676 Speaker 2: way in addition to just driving across it.
127 00:06:30,836 --> 00:06:33,316 Speaker 1: That's right, and these are slow moving vehicles where you're 128 00:06:33,316 --> 00:06:36,596 Speaker 1: already on a closed site with people who are assumed 129 00:06:36,596 --> 00:06:39,236 Speaker 1: to be knowledgeable about the world around you, and you're 130 00:06:39,236 --> 00:06:41,636 Speaker 1: moving at like five miles an hour, for example. You're 131 00:06:41,676 --> 00:06:43,556 Speaker 1: able to slow down and stop. You're able to minimize 132 00:06:43,556 --> 00:06:46,516 Speaker 1: your exposure. You always have a minimum safety condition of stopping. 133 00:06:46,716 --> 00:06:49,836 Speaker 1: Your complexity is less about interactions with others on the 134 00:06:49,916 --> 00:06:52,236 Speaker 1: road and more about the interactions with the world around you. 135 00:06:52,756 --> 00:06:55,796 Speaker 1: And so you can actually tackle safety not through the 136 00:06:55,876 --> 00:06:58,876 Speaker 1: sort of statistical methods that drove a lot of the 137 00:06:58,916 --> 00:07:03,436 Speaker 1: mileage that we collected, but through a much more direct 138 00:07:03,516 --> 00:07:05,676 Speaker 1: measure of your competencies in order to just make sure 139 00:07:05,676 --> 00:07:08,796 Speaker 1: that you're, like, not actually capable of hurting somebody. 140 00:07:08,916 --> 00:07:12,316 Speaker 2: Yeah, more like an industrial robot almost, right. 141 00:07:12,236 --> 00:07:15,396 Speaker 1: That's right. Yeah, it's like traditional safety systems engineering. Yeah. 142 00:07:15,236 --> 00:07:17,396 Speaker 2: Where it's like, look, even if it can't do the 143 00:07:17,436 --> 00:07:19,516 Speaker 2: thing every time, that's fine. Just make sure it's not 144 00:07:19,556 --> 00:07:21,516 Speaker 2: gonna like go crazy and kill somebody. 145 00:07:21,556 --> 00:07:22,916 Speaker 1: That's right. You can bound it to that.
Like your 146 00:07:22,916 --> 00:07:25,676 Speaker 1: worst case is your productivity suffers, but safety-wise, you're 147 00:07:25,956 --> 00:07:30,076 Speaker 1: solid. And so that's like incredibly enabling, because 148 00:07:30,076 --> 00:07:32,196 Speaker 1: your long tail is now no longer safety. It's the 149 00:07:32,276 --> 00:07:33,356 Speaker 1: versatility of what you can do. 150 00:07:33,556 --> 00:07:35,156 Speaker 2: Why did you start with excavators? 151 00:07:36,236 --> 00:07:40,396 Speaker 1: So excavators are the most highly utilized machine, like they're 152 00:07:40,516 --> 00:07:43,596 Speaker 1: usually the highest volume in a fleet. So between twenty and 153 00:07:43,676 --> 00:07:45,796 Speaker 1: twenty five percent of fleets are excavators. 154 00:07:45,796 --> 00:07:48,716 Speaker 2: Fleets are just like big construction heavy equipment. 155 00:07:48,836 --> 00:07:51,676 Speaker 1: Yeah, like a general contractor that owns, like, a thousand 156 00:07:51,716 --> 00:07:54,076 Speaker 1: machines, two hundred to two hundred and fifty will probably be 157 00:07:54,116 --> 00:07:55,556 Speaker 1: excavators on average. 158 00:07:55,596 --> 00:07:57,676 Speaker 2: And it's basically what a kid would call a digger, right? 159 00:07:57,716 --> 00:08:00,476 Speaker 2: It's like a digger: a bucket, an arm, and it 160 00:08:00,636 --> 00:08:01,636 Speaker 2: like digs stuff up. 161 00:08:01,756 --> 00:08:03,596 Speaker 1: It's like the equivalent of an arm. Like you can 162 00:08:03,676 --> 00:08:05,876 Speaker 1: like dig stuff, you can demolish stuff, you can swap 163 00:08:05,956 --> 00:08:08,116 Speaker 1: your tools, you can like lift pipes and put them 164 00:08:08,116 --> 00:08:09,396 Speaker 1: in holes. You can do a ton of stuff 165 00:08:09,396 --> 00:08:11,196 Speaker 1: with it. It's kind of crazy. It really is a 166 00:08:11,196 --> 00:08:13,916 Speaker 1: versatile machine. It also makes it very complicated to operate.
167 00:08:13,956 --> 00:08:16,476 Speaker 1: So there's like seven degrees of freedom, sometimes eight, and 168 00:08:16,556 --> 00:08:18,676 Speaker 1: so it's one of the hardest machines to learn, and 169 00:08:18,716 --> 00:08:20,716 Speaker 1: it takes four to five years to really become an expert. 170 00:08:20,796 --> 00:08:22,836 Speaker 1: And there's a huge difference between an expert and a novice, 171 00:08:22,876 --> 00:08:25,436 Speaker 1: and so you kind of have this situation where it's 172 00:08:25,516 --> 00:08:28,876 Speaker 1: a huge volume of work and it's really hard to learn, 173 00:08:28,876 --> 00:08:30,076 Speaker 1: and so you have a really deep pull in the 174 00:08:30,076 --> 00:08:31,676 Speaker 1: market for it, meaning. 175 00:08:31,516 --> 00:08:33,356 Speaker 2: A lot of demand, like a lot of people want 176 00:08:33,956 --> 00:08:36,676 Speaker 2: somebody who can drive one of these or a machine 177 00:08:36,676 --> 00:08:37,236 Speaker 2: that could do it. 178 00:08:37,396 --> 00:08:39,196 Speaker 1: Well, let me tell you about the demand. I've never 179 00:08:39,236 --> 00:08:42,236 Speaker 1: seen such a divergence of supply and demand in 180 00:08:42,276 --> 00:08:45,196 Speaker 1: my career in any space, where on the demand side, 181 00:08:45,236 --> 00:08:48,916 Speaker 1: you have this astronomical construction industry that's already two trillion 182 00:08:48,956 --> 00:08:51,436 Speaker 1: dollars a year in the US, that's obviously very heavily 183 00:08:51,476 --> 00:08:54,516 Speaker 1: dependent on building and machinery work. And then you have this 184 00:08:54,636 --> 00:08:57,116 Speaker 1: shortage of operators that already existed, but it's going 185 00:08:57,116 --> 00:08:59,716 Speaker 1: in the wrong direction, where forty percent of construction workers 186 00:08:59,836 --> 00:09:02,356 Speaker 1: are retiring in the next ten years.
Our partners are 187 00:09:02,916 --> 00:09:06,996 Speaker 1: consistently having trouble filling labor. We've met some that have 188 00:09:07,076 --> 00:09:08,196 Speaker 1: one hundred percent turnover. 189 00:09:09,196 --> 00:09:10,556 Speaker 2: Everybody leaves every year. 190 00:09:10,756 --> 00:09:12,996 Speaker 1: It means like a third of people might be lifers, 191 00:09:13,036 --> 00:09:15,756 Speaker 1: but then like two thirds transition more than once per year, 192 00:09:15,756 --> 00:09:19,156 Speaker 1: and you're constantly backfilling. The skill sets vary a ton, 193 00:09:19,676 --> 00:09:22,276 Speaker 1: and one of them said that for every one person 194 00:09:22,396 --> 00:09:24,516 Speaker 1: entering the workforce of the quality that they look for, 195 00:09:24,596 --> 00:09:27,356 Speaker 1: there are seven leaving. And so what we end up having is 196 00:09:27,876 --> 00:09:30,796 Speaker 1: this shortage of ability to meet this demand, and so 197 00:09:30,916 --> 00:09:33,836 Speaker 1: prices go up, jobs don't get done. And what's interesting 198 00:09:33,916 --> 00:09:36,036 Speaker 1: is it's not like an isolated industry that's just 199 00:09:36,156 --> 00:09:38,196 Speaker 1: on its own kind of having these sorts of challenges. 200 00:09:38,556 --> 00:09:41,556 Speaker 1: It's a horizontal; it supports every industry. You can't build 201 00:09:41,636 --> 00:09:44,116 Speaker 1: data centers for AI without it, you can't build houses, 202 00:09:44,196 --> 00:09:46,236 Speaker 1: you can't build energy facilities. 203 00:09:46,036 --> 00:09:49,036 Speaker 2: Sort of like a rate limiting skill or rate limiting machine. 204 00:09:49,276 --> 00:09:51,156 Speaker 1: For the whole country. That's exactly right. And then you 205 00:09:51,196 --> 00:09:54,596 Speaker 1: have this need that starts with labor. But then there's safety. 206 00:09:54,636 --> 00:09:57,796 Speaker 1: There's huge amounts of safety challenges.
You have, you know, 207 00:09:57,956 --> 00:10:01,996 Speaker 1: huge predictability challenges. So if you actually could soften 208 00:10:01,996 --> 00:10:04,676 Speaker 1: some of these constraints, we would build more, more work 209 00:10:04,716 --> 00:10:06,996 Speaker 1: would get done, and it would like stimulate the whole economy. 210 00:10:06,996 --> 00:10:09,316 Speaker 1: And so that's what's actually pretty exciting about this opportunity. 211 00:10:09,636 --> 00:10:11,996 Speaker 1: It's not a zero-sum game at all. 212 00:10:12,076 --> 00:10:14,836 Speaker 2: So, like, you decide to focus on excavators, you, you know, 213 00:10:14,956 --> 00:10:16,996 Speaker 2: you raise money for your company. Like, do you go 214 00:10:17,076 --> 00:10:21,076 Speaker 2: out and buy a, whatever, a million dollar excavator? You 215 00:10:21,076 --> 00:10:23,276 Speaker 2: go to, what is it, bucket and shovel dot com, 216 00:10:23,316 --> 00:10:24,756 Speaker 2: and buy yourself an excavator? 217 00:10:25,236 --> 00:10:28,116 Speaker 1: They're like three hundred thousand to five hundred thousand, so 218 00:10:28,156 --> 00:10:30,796 Speaker 1: they're like still expensive. Yeah, they're pretty expensive. 219 00:10:30,876 --> 00:10:32,796 Speaker 2: So I mean like, did you buy one? 220 00:10:32,956 --> 00:10:33,636 Speaker 1: Like we did? Yeah. 221 00:10:33,716 --> 00:10:34,476 Speaker 2: Did you drive it? 222 00:10:34,596 --> 00:10:36,356 Speaker 1: Of course, like you have to. It's like 223 00:10:36,356 --> 00:10:38,596 Speaker 1: a rite of passage. Yeah, they're really fun. I even 224 00:10:38,596 --> 00:10:41,396 Speaker 1: took a lesson. There's a place in Vegas that lets you drive 225 00:10:41,436 --> 00:10:44,156 Speaker 1: excavators with, like, a trained operator. 226 00:10:44,156 --> 00:10:44,756 Speaker 1: It was pretty fun. 227 00:10:44,796 --> 00:10:47,556 Speaker 2: So, oh, that's genius. You get to break things.
I 228 00:10:47,596 --> 00:10:50,276 Speaker 2: was actually, as I was walking here today to the 229 00:10:50,356 --> 00:10:53,876 Speaker 2: train, and there's a playground, and there was an excavator 230 00:10:53,916 --> 00:10:57,116 Speaker 2: just like breaking up the asphalt and then like prying 231 00:10:57,156 --> 00:10:59,276 Speaker 2: it up. I was like, that looks awesome. 232 00:10:59,676 --> 00:11:02,156 Speaker 1: This is the funniest thing. Anybody that gets an excavator, 233 00:11:02,236 --> 00:11:04,476 Speaker 1: like your inner six year old comes out. Suddenly you 234 00:11:04,556 --> 00:11:07,076 Speaker 1: have like the most mature, sophisticated person who's trying to 235 00:11:07,076 --> 00:11:09,636 Speaker 1: be all professional. You get an excavator and just like 236 00:11:09,996 --> 00:11:12,236 Speaker 1: grab like a giant glob of mud and then like 237 00:11:12,476 --> 00:11:14,636 Speaker 1: bring it as high as it can and then like plop 238 00:11:14,676 --> 00:11:16,716 Speaker 1: it down and see what happens. It's amazing, and it's like, 239 00:11:16,716 --> 00:11:18,836 Speaker 1: without fail, everybody kind of reverts back to this sort 240 00:11:18,836 --> 00:11:20,476 Speaker 1: of world. Yeah, we drove, and I think we've 241 00:11:20,556 --> 00:11:23,196 Speaker 1: bought like a half dozen excavators at this point, and 242 00:11:23,236 --> 00:11:25,316 Speaker 1: then we also use a lot of excavators from 243 00:11:25,316 --> 00:11:28,876 Speaker 1: our partners, who are general contractors and subcontractors. 244 00:11:28,956 --> 00:11:30,796 Speaker 2: And it's just, what, a year or so ago that 245 00:11:30,836 --> 00:11:32,276 Speaker 2: you started? Like, not that long in. 246 00:11:32,236 --> 00:11:34,436 Speaker 1: It's less than a year and a half, yeah, like 247 00:11:34,516 --> 00:11:36,956 Speaker 1: last year. Yeah, it's been a pretty good run.
248 00:11:37,716 --> 00:11:40,636 Speaker 2: How much of autonomy is sort of commodified, right? How 249 00:11:40,676 --> 00:11:42,916 Speaker 2: much is just like, well, we're gonna buy this, this, 250 00:11:42,996 --> 00:11:44,636 Speaker 2: and this in terms of sort of hardware and we 251 00:11:44,676 --> 00:11:46,356 Speaker 2: know the software, and then how much is like, oh, 252 00:11:46,396 --> 00:11:48,596 Speaker 2: here's the things we have to figure out that nobody 253 00:11:48,596 --> 00:11:49,156 Speaker 2: knows how to do. 254 00:11:49,676 --> 00:11:52,556 Speaker 1: So it's one of the other enablers that's like way 255 00:11:52,556 --> 00:11:54,076 Speaker 1: better than what we would have had to go through 256 00:11:54,116 --> 00:11:56,596 Speaker 1: ten years ago in the space. We can use a 257 00:11:56,636 --> 00:11:59,436 Speaker 1: lot of the existing components, on lidar, on cameras, on 258 00:11:59,516 --> 00:12:04,116 Speaker 1: IMUs, GPS. There's a lot of tailwind from automotive-grade 259 00:12:04,196 --> 00:12:05,076 Speaker 1: cameras and compute. 260 00:12:05,076 --> 00:12:08,236 Speaker 2: That's like accelerating, and like presumably they're buying those things 261 00:12:08,276 --> 00:12:10,876 Speaker 2: at scale, so you can like go get 262 00:12:10,916 --> 00:12:13,596 Speaker 2: them cheap. Essentially, you're like, give me one of those, 263 00:12:13,596 --> 00:12:14,716 Speaker 2: one of those, one of those. 264 00:12:14,516 --> 00:12:16,956 Speaker 1: That's right, because you're gonna go to millions and millions 265 00:12:16,996 --> 00:12:19,516 Speaker 1: of units as like every car gets an autopilot equivalent 266 00:12:19,556 --> 00:12:21,596 Speaker 1: over the next two, three years. Oh, that's starting to 267 00:12:21,636 --> 00:12:23,316 Speaker 1: kind of come down in cost. So that helps. And 268 00:12:23,356 --> 00:12:27,436 Speaker 1: then even the platform, we can retrofit existing machinery.
269 00:12:27,716 --> 00:12:29,596 Speaker 2: When you say the platform, what do you mean in 270 00:12:29,596 --> 00:12:30,796 Speaker 2: this context? 271 00:12:30,476 --> 00:12:33,116 Speaker 1: Platform is like the car, the truck, the excavator, and 272 00:12:33,156 --> 00:12:35,316 Speaker 1: in this case the excavator. Yeah, the machine, 273 00:12:35,356 --> 00:12:40,196 Speaker 1: so like the excavator itself. Construction machines, particularly this latest generation, 274 00:12:40,676 --> 00:12:44,156 Speaker 1: are really well designed, to where they're already effectively drive 275 00:12:44,196 --> 00:12:46,156 Speaker 1: by wire, which means that every signal going through the 276 00:12:46,156 --> 00:12:49,276 Speaker 1: machine is electric, and so we're able to splice into 277 00:12:49,356 --> 00:12:54,396 Speaker 1: it and both read and write to these signals. 278 00:12:54,036 --> 00:12:58,516 Speaker 2: And control them. Electric meaning, like, computerized basically. 279 00:12:58,356 --> 00:13:01,076 Speaker 1: Meaning, like, you have a joystick that operates the excavator 280 00:13:01,316 --> 00:13:04,076 Speaker 1: that's not physically connected to the hydraulics. It's an electric 281 00:13:04,116 --> 00:13:06,756 Speaker 1: signal that's connected to the hydraulics, and it's the same 282 00:13:06,796 --> 00:13:09,516 Speaker 1: for every signal in the machine, like the sensors, the 283 00:13:09,836 --> 00:13:12,716 Speaker 1: pressure gauges. And so we're able to both get the 284 00:13:12,796 --> 00:13:15,836 Speaker 1: data from the machine as well as control the machine 285 00:13:16,356 --> 00:13:20,276 Speaker 1: through a non invasive integration, where we can outfit a 286 00:13:20,316 --> 00:13:22,676 Speaker 1: machine with our sensors and compute suite in like less 287 00:13:22,676 --> 00:13:26,196 Speaker 1: than three hours and make it autonomy capable, and it's 288 00:13:26,196 --> 00:13:28,716 Speaker 1: completely reversible.
We could never do that with a car 289 00:13:28,836 --> 00:13:31,996 Speaker 1: or a truck, because the platforms were just not designed 290 00:13:31,996 --> 00:13:32,316 Speaker 1: this way. 291 00:13:32,476 --> 00:13:35,436 Speaker 2: Yeah, so there's a lot that's there already. The excavators 292 00:13:35,476 --> 00:13:38,516 Speaker 2: themselves are sort of easy to integrate with lots of 293 00:13:38,516 --> 00:13:41,356 Speaker 2: off the shelf technology. What's not there? Like, when you're 294 00:13:41,356 --> 00:13:43,196 Speaker 2: coming in, what do you have to sort of build 295 00:13:43,236 --> 00:13:44,516 Speaker 2: that nobody has built before? 296 00:13:44,956 --> 00:13:49,476 Speaker 1: Nobody has actually created autonomy that can solve the really 297 00:13:49,516 --> 00:13:51,676 Speaker 1: nuanced problems that you need to solve in order to 298 00:13:51,676 --> 00:13:55,276 Speaker 1: operate like an excavator in construction tasks. 299 00:13:55,596 --> 00:13:59,116 Speaker 2: So let's talk like specifically. You buy your excavators, you've 300 00:13:59,156 --> 00:14:02,796 Speaker 2: got your hardware, you know, whatever basic AI model 301 00:14:02,836 --> 00:14:05,436 Speaker 2: you're going to use. But like, what's a specific thing 302 00:14:05,476 --> 00:14:07,196 Speaker 2: you have to figure out? 303 00:14:07,116 --> 00:14:08,596 Speaker 1: Well, a lot of things. So first of all, it's not trivial 304 00:14:08,636 --> 00:14:11,956 Speaker 1: to tap into these machines and to build a platform 305 00:14:12,036 --> 00:14:14,836 Speaker 1: or kind of an integration around it where you can 306 00:14:15,116 --> 00:14:17,236 Speaker 1: read them, you can control them, and so you have 307 00:14:17,276 --> 00:14:19,716 Speaker 1: to basically create like a wrapper around the machine and 308 00:14:19,836 --> 00:14:22,276 Speaker 1: also design it in a way that scales later to new machines.
Right, 309 00:14:22,556 --> 00:14:25,436 Speaker 1: So that part's hard. It's a big autonomy problem. So 310 00:14:25,516 --> 00:14:27,516 Speaker 1: in that respect it's no different than what we tackled 311 00:14:27,516 --> 00:14:30,476 Speaker 1: at Waymo, where you have to train massive scale models. 312 00:14:30,476 --> 00:14:31,836 Speaker 1: You have to collect a huge amount of data. 313 00:14:31,876 --> 00:14:34,516 Speaker 2: And in this case it's like about the digging, like 314 00:14:34,636 --> 00:14:37,156 Speaker 2: it's about the digging. Dumb question: is the digging the 315 00:14:37,156 --> 00:14:39,116 Speaker 2: hard part? Like, what happens when the thing hits the ground? 316 00:14:39,116 --> 00:14:41,036 Speaker 2: There's different kinds of stuff in the ground. 317 00:14:41,156 --> 00:14:43,836 Speaker 1: Let me tell you, in the category of what's new: suddenly 318 00:14:44,276 --> 00:14:47,396 Speaker 1: a car doesn't change the environment around it. You have 319 00:14:47,436 --> 00:14:50,956 Speaker 1: really complicated, tough earth and soil, and you've got 320 00:14:50,996 --> 00:14:52,916 Speaker 1: to figure out the physics of how you dig through it, 321 00:14:53,196 --> 00:14:56,276 Speaker 1: how do you deal with clay versus topsoil, how 322 00:14:56,316 --> 00:14:58,236 Speaker 1: do you deal with rocks inside? And now, if 323 00:14:58,276 --> 00:15:00,796 Speaker 1: you want to use simulation to solve parts of these problems, 324 00:15:00,916 --> 00:15:03,036 Speaker 1: you have a very complicated simulation problem, because you have 325 00:15:03,076 --> 00:15:05,356 Speaker 1: to solve not just the sensor side, but also the 326 00:15:05,956 --> 00:15:09,596 Speaker 1: manipulation side.
You have to structure this problem in a 327 00:15:09,636 --> 00:15:13,476 Speaker 1: way where you're learning from the data you collect in 328 00:15:13,556 --> 00:15:15,956 Speaker 1: order to actually capture the nuances of how you 329 00:15:15,996 --> 00:15:18,396 Speaker 1: actually interact with the environment. You have to then control 330 00:15:18,476 --> 00:15:20,196 Speaker 1: and execute it the right way. You have to think 331 00:15:20,236 --> 00:15:22,156 Speaker 1: about how do you actually define the goal. 332 00:15:22,476 --> 00:15:24,956 Speaker 2: So from the user point of view with an excavator, 333 00:15:25,116 --> 00:15:27,236 Speaker 2: like what does it look like? Like, I just want 334 00:15:27,236 --> 00:15:29,156 Speaker 2: to dig a hole of this size at this spot? 335 00:15:29,236 --> 00:15:29,996 Speaker 2: Or like, what is it? 336 00:15:30,276 --> 00:15:32,836 Speaker 1: Yeah, and this is a journey. Like, fast forward to, you 337 00:15:32,876 --> 00:15:35,396 Speaker 1: know, next year, where our customer does this, right? So 338 00:15:35,436 --> 00:15:38,116 Speaker 1: what they would do is they would give it a 339 00:15:38,116 --> 00:15:40,676 Speaker 1: model of what they want the earth to be dug 340 00:15:40,716 --> 00:15:42,516 Speaker 1: towards, and so that would be like a three D 341 00:15:42,636 --> 00:15:47,676 Speaker 1: representation of the depth. So width, length, depth, here's the edges, 342 00:15:47,716 --> 00:15:50,516 Speaker 1: here's the constraints. The pickups for the trucks are going 343 00:15:50,556 --> 00:15:52,796 Speaker 1: to be over here, and then go to work.
And 344 00:15:52,836 --> 00:15:56,036 Speaker 1: basically these projects, like these machines will work for many 345 00:15:56,036 --> 00:15:59,276 Speaker 1: months at a time, continually just digging earth and working 346 00:15:59,276 --> 00:16:01,836 Speaker 1: it towards the right foundation, and the system will respect 347 00:16:01,876 --> 00:16:05,796 Speaker 1: that boundary and basically dig down to that level. It'll 348 00:16:05,796 --> 00:16:08,796 Speaker 1: have precision to the edges that you've defined, it'll do 349 00:16:08,796 --> 00:16:10,436 Speaker 1: it in a sequence that makes sense so that you 350 00:16:10,476 --> 00:16:12,596 Speaker 1: don't trap yourself in a hole, for example. And then 351 00:16:12,636 --> 00:16:16,516 Speaker 1: you specify where you're going to have dump truck pickups. 352 00:16:16,636 --> 00:16:19,396 Speaker 1: Then there's projects where this happens for nine months straight 353 00:16:19,476 --> 00:16:21,916 Speaker 1: or twelve months straight with like many machines, and so 354 00:16:22,236 --> 00:16:25,556 Speaker 1: what you basically need to specify is what you want 355 00:16:25,596 --> 00:16:27,356 Speaker 1: to dig to, and what you want to dig 356 00:16:27,396 --> 00:16:31,276 Speaker 1: to ends up basically being the foundation that you're working 357 00:16:31,276 --> 00:16:33,716 Speaker 1: towards for whatever you're going to construct there. And we 358 00:16:33,756 --> 00:16:36,316 Speaker 1: want that to be versatile. So sometimes you're just taking 359 00:16:36,316 --> 00:16:38,756 Speaker 1: off a layer of topsoil. Sometimes you're digging eight 360 00:16:38,756 --> 00:16:42,516 Speaker 1: feet deep. Sometimes you're taking an existing stockpile and moving 361 00:16:42,556 --> 00:16:45,156 Speaker 1: it somewhere else. So there's a lot of permutations of this.
362 00:16:45,276 --> 00:16:48,756 Speaker 1: Sometimes you're taking rubble from a demolition job and loading 363 00:16:48,796 --> 00:16:51,356 Speaker 1: it onto trucks, right? And so what's nice is that 364 00:16:51,436 --> 00:16:53,516 Speaker 1: you start to see patterns over and over again in 365 00:16:53,556 --> 00:16:54,316 Speaker 1: this sort of work. 366 00:16:58,676 --> 00:17:10,116 Speaker 3: We'll be back in just a minute. 367 00:17:11,956 --> 00:17:14,276 Speaker 2: So the data problem is interesting here, right? Like, you 368 00:17:14,316 --> 00:17:16,476 Speaker 2: were talking about the physics. Just the physics of an 369 00:17:16,516 --> 00:17:19,356 Speaker 2: excavator is quite different, right? Like, it's pushing down on 370 00:17:19,396 --> 00:17:21,676 Speaker 2: the ground, which is pushing back on the excavator, and, 371 00:17:21,716 --> 00:17:24,156 Speaker 2: like, as you say, the ground is changing because of 372 00:17:24,196 --> 00:17:28,076 Speaker 2: its work, and there's not... Well, where do you get 373 00:17:28,076 --> 00:17:28,996 Speaker 2: the data? 374 00:17:29,356 --> 00:17:31,876 Speaker 1: Data is actually an interesting mix. We get it both 375 00:17:31,956 --> 00:17:35,916 Speaker 1: on our test sites and also with our design partners. 376 00:17:35,956 --> 00:17:39,436 Speaker 1: So we're working with general contractors and subcontractors. Today 377 00:17:39,436 --> 00:17:42,196 Speaker 1: we have five that we're partnered with across southern states 378 00:17:42,196 --> 00:17:44,716 Speaker 1: like Arizona, Texas, and so we're getting data on their sites. 379 00:17:44,716 --> 00:17:45,996 Speaker 1: We're getting data on our sites. 380 00:17:46,116 --> 00:17:48,916 Speaker 2: It's not that much data, right? Like, I mean, I 381 00:17:48,916 --> 00:17:52,276 Speaker 2: guess my reference point is always like ImageNet or, 382 00:17:52,516 --> 00:17:54,956 Speaker 2: you know, the Internet for large language models.
It does 383 00:17:54,996 --> 00:17:57,996 Speaker 2: seem like a sort of recurring problem in robotics-type 384 00:17:57,996 --> 00:18:01,316 Speaker 2: AI applications is sparsity of data. 385 00:18:01,436 --> 00:18:02,116 Speaker 1: How do you get it? 386 00:18:02,596 --> 00:18:03,196 Speaker 2: Yeah? 387 00:18:03,396 --> 00:18:06,196 Speaker 1: Yeah. And so this is where there's a few kind 388 00:18:06,236 --> 00:18:08,676 Speaker 1: of intricacies. So first of all, like, our partners together 389 00:18:08,676 --> 00:18:10,596 Speaker 1: have thousands and thousands of machines, and so there's a 390 00:18:10,596 --> 00:18:12,916 Speaker 1: lot of choices of which sites to sample. The 391 00:18:12,956 --> 00:18:14,716 Speaker 1: other thing that you can do is actually you can 392 00:18:14,756 --> 00:18:17,756 Speaker 1: be very clever on a test site, and when you're 393 00:18:17,756 --> 00:18:21,956 Speaker 1: on a real project, you're kind of getting an unbiased sample. 394 00:18:21,956 --> 00:18:24,516 Speaker 1: You're just getting a random distribution of the things you see, 395 00:18:24,596 --> 00:18:26,516 Speaker 1: just like driving around on a road. When you're on 396 00:18:26,556 --> 00:18:29,796 Speaker 1: a test site, you can actually upsample the things 397 00:18:29,796 --> 00:18:31,996 Speaker 1: you actually want, and you can go and you can 398 00:18:31,996 --> 00:18:35,236 Speaker 1: collect ten hours of data that's representative of five thousand 399 00:18:35,236 --> 00:18:38,636 Speaker 1: hours of random data, which is particularly useful for 400 00:18:38,676 --> 00:18:41,636 Speaker 1: things like safety situations. Right? You can actually create a 401 00:18:41,716 --> 00:18:45,156 Speaker 1: much larger equivalent amount of data on a closed course 402 00:18:45,196 --> 00:18:47,156 Speaker 1: through like kind of structured testing.
And so for example, 403 00:18:47,676 --> 00:18:50,876 Speaker 1: safety scenarios where you have weird interactions with people doing 404 00:18:50,956 --> 00:18:53,276 Speaker 1: things they shouldn't do. You don't wait to see that 405 00:18:53,356 --> 00:18:54,676 Speaker 1: on an open site. 406 00:18:55,076 --> 00:18:57,916 Speaker 2: What is one of those? Like, what is your nightmare 407 00:18:57,996 --> 00:18:59,396 Speaker 2: human behavior scenario? 408 00:18:59,636 --> 00:19:02,036 Speaker 1: In the field? The nightmare human behavior is: a human 409 00:19:02,196 --> 00:19:05,516 Speaker 1: is curious, they walk up to your machine, your machine stops, 410 00:19:05,516 --> 00:19:07,116 Speaker 1: and then they get really, really close, and they are 411 00:19:07,156 --> 00:19:09,036 Speaker 1: now in a blind spot where you can't see them. 412 00:19:09,516 --> 00:19:10,956 Speaker 1: But you still have to be smart enough to track 413 00:19:11,036 --> 00:19:12,956 Speaker 1: them, whether they're in a hole in front of you, 414 00:19:12,996 --> 00:19:16,076 Speaker 1: while you're like thinking about digging, right? So occlusions from 415 00:19:16,196 --> 00:19:20,676 Speaker 1: humans is probably a huge category, which is complex. 416 00:19:20,836 --> 00:19:23,796 Speaker 2: Occlusions, meaning in your blind spot: humans in a place 417 00:19:23,796 --> 00:19:25,076 Speaker 2: where the machine can't see them. 418 00:19:25,236 --> 00:19:28,276 Speaker 1: Yeah. Your number one priority is anything that 419 00:19:28,356 --> 00:19:31,836 Speaker 1: touches on human safety. That's sacred. You never take any 420 00:19:31,876 --> 00:19:32,476 Speaker 1: risks on that. 421 00:19:32,836 --> 00:19:34,716 Speaker 2: What's something you haven't figured out yet? 422 00:19:34,916 --> 00:19:38,236 Speaker 1: The things people do with excavators. Like, we've seen them 423 00:19:38,916 --> 00:19:43,596 Speaker 1: do bizarre, like, clearing of debris.
We've seen them load 424 00:19:43,636 --> 00:19:48,236 Speaker 1: a wheel loader with dirt. We've seen them bang tools to 425 00:19:48,436 --> 00:19:51,716 Speaker 1: change them. 426 00:19:51,756 --> 00:19:54,156 Speaker 1: What's that one? Like, a tool gets stuck and they 427 00:19:54,236 --> 00:19:56,396 Speaker 1: like bang it on the ground in order to get 428 00:19:56,396 --> 00:19:59,556 Speaker 1: it loose. They use their arm as a pivot point 429 00:19:59,716 --> 00:20:02,316 Speaker 1: to turn when the wheels are like stuck in mud. 430 00:20:02,476 --> 00:20:04,556 Speaker 2: It's kind of baller. Like, it's pretty baller. 431 00:20:04,636 --> 00:20:06,556 Speaker 1: Yeah, it's like, I mean, like when you see the 432 00:20:06,836 --> 00:20:08,876 Speaker 1: expert operators, they just show off, and it's like, it's 433 00:20:08,876 --> 00:20:11,716 Speaker 1: incredible, like they're awesome. And so there's like all these 434 00:20:11,756 --> 00:20:14,916 Speaker 1: like weird subtleties of how they'll use these tools in 435 00:20:14,956 --> 00:20:18,916 Speaker 1: like really subtle ways, which, like, the dimensions on 436 00:20:18,956 --> 00:20:21,076 Speaker 1: the product side, of the use cases, blew us away. 437 00:20:21,436 --> 00:20:24,116 Speaker 1: We thought about the obvious ones, but as we started 438 00:20:24,156 --> 00:20:28,196 Speaker 1: like really going deeper and like studying this, there's so 439 00:20:28,316 --> 00:20:30,076 Speaker 1: much diversity in interesting things that you can do with 440 00:20:30,116 --> 00:20:31,516 Speaker 1: these machines. It's quite powerful.
Right, 443 00:20:37,196 --> 00:20:41,396 Speaker 2: you can just have the, like, competent automaton that can 444 00:20:41,956 --> 00:20:44,116 Speaker 2: dig a big hole just the way you want it. 445 00:20:44,596 --> 00:20:46,436 Speaker 1: That's correct, and there's a huge amount of work 446 00:20:46,436 --> 00:20:49,316 Speaker 1: even in those areas. But there is this, like, tail 447 00:20:49,316 --> 00:20:50,756 Speaker 1: where you go in and over time add 448 00:20:50,796 --> 00:20:52,436 Speaker 1: more and more capability, and then I think it 449 00:20:52,596 --> 00:20:54,796 Speaker 1: just comes as software updates and you get more. It's just like, 450 00:20:54,836 --> 00:20:57,756 Speaker 1: you know, the product ends up being the digital 451 00:20:57,796 --> 00:21:00,156 Speaker 1: operator, which gets better over time, and so it's similar 452 00:21:00,236 --> 00:21:01,476 Speaker 1: to a human getting better. 453 00:21:01,956 --> 00:21:03,436 Speaker 2: What's the business model? 454 00:21:03,676 --> 00:21:06,116 Speaker 1: So we will go operator-out next year. So that'll 455 00:21:06,156 --> 00:21:09,316 Speaker 1: be our first, like, fully operator-less product that is 456 00:21:09,876 --> 00:21:12,196 Speaker 1: in the spirit of, like, what this is meant to scale as. 457 00:21:11,916 --> 00:21:15,676 Speaker 2: Operator-out means driverless, means autonomous. Is that 458 00:21:15,676 --> 00:21:16,156 Speaker 2: what it means? 459 00:21:16,276 --> 00:21:18,796 Speaker 1: Yeah, nobody there. It's just, like, operator-less. Yeah, and 460 00:21:18,876 --> 00:21:22,716 Speaker 1: so our product is the digital operator.
So what I 461 00:21:22,756 --> 00:21:24,876 Speaker 1: mean by this is that our customer is a general 462 00:21:24,916 --> 00:21:28,036 Speaker 1: contractor or subcontractor, so it's the companies that already buy 463 00:21:28,036 --> 00:21:32,316 Speaker 1: and manage these machines. We sell them a 464 00:21:32,316 --> 00:21:35,196 Speaker 1: retrofit of these machines, so we install an upfit that 465 00:21:35,516 --> 00:21:36,756 Speaker 1: adds sensors and compute. 466 00:21:36,996 --> 00:21:40,516 Speaker 2: So the contractor already owns the excavator. 467 00:21:39,996 --> 00:21:42,196 Speaker 1: That's right. They've already bought a five hundred thousand dollar 468 00:21:42,236 --> 00:21:44,396 Speaker 1: machine or a three hundred thousand dollar machine, and so this 469 00:21:44,476 --> 00:21:48,036 Speaker 1: is an upgrade that enables autonomy. And then our business 470 00:21:48,076 --> 00:21:52,516 Speaker 1: model is selling labor. So we're effectively selling the labor, 471 00:21:52,556 --> 00:21:55,316 Speaker 1: and then a variety of digital services around it that 472 00:21:55,436 --> 00:21:58,116 Speaker 1: operate the machine. And for the surface area of types 473 00:21:58,156 --> 00:22:01,236 Speaker 1: of tasks that it's approved for, it's completely driverless. And 474 00:22:01,236 --> 00:22:03,556 Speaker 1: then that surface area increases over time with software updates, 475 00:22:03,636 --> 00:22:06,316 Speaker 1: and for everything else, it can still be manually operated. 476 00:22:06,476 --> 00:22:08,436 Speaker 2: So is the core product... are they paying you by 477 00:22:08,476 --> 00:22:09,596 Speaker 2: the hour for digging? 478 00:22:10,076 --> 00:22:12,556 Speaker 1: So we're figuring it out, but it'll be something that 479 00:22:12,676 --> 00:22:15,916 Speaker 1: is either by the hour, by the project, or by subscription.
480 00:22:15,956 --> 00:22:19,916 Speaker 1: So it's effectively... the way that today projects are forecasted 481 00:22:19,956 --> 00:22:24,356 Speaker 1: and billed by shifts, you know, for labor, it's a 482 00:22:24,396 --> 00:22:26,956 Speaker 1: parallel of that, but it becomes a huge win because 483 00:22:27,636 --> 00:22:30,316 Speaker 1: you have complete flexibility. You can work ten hours a day, 484 00:22:30,316 --> 00:22:32,596 Speaker 1: you can work twenty four hours a day. So there's 485 00:22:32,716 --> 00:22:34,996 Speaker 1: all sorts of benefits that will operationally give you a 486 00:22:34,996 --> 00:22:37,836 Speaker 1: lot of leeway on how you use it. What might 487 00:22:37,956 --> 00:22:40,796 Speaker 1: go wrong? What might go wrong. I'll tell you the 488 00:22:40,796 --> 00:22:43,116 Speaker 1: things that are like really challenging, that we worry 489 00:22:43,116 --> 00:22:47,116 Speaker 1: about, where we think the nuance and diversity of 490 00:22:47,196 --> 00:22:50,756 Speaker 1: things are high. As much as you want to just 491 00:22:51,916 --> 00:22:54,156 Speaker 1: boil it down to a very simple dig and load 492 00:22:54,236 --> 00:22:58,316 Speaker 1: sort of operation, there's always little corner cases. There's things 493 00:22:58,396 --> 00:23:02,036 Speaker 1: you find in the ground, there's weird ways that trucks 494 00:23:02,036 --> 00:23:04,756 Speaker 1: can interact with you, there's things people will do. There's 495 00:23:04,836 --> 00:23:07,916 Speaker 1: varieties of machines; there's like fifty kinds of excavator 496 00:23:07,956 --> 00:23:11,036 Speaker 1: models and sizes and so forth. So I think the 497 00:23:11,076 --> 00:23:12,556 Speaker 1: long tail is still challenging.
498 00:23:13,156 --> 00:23:16,156 Speaker 2: Presumably you don't have to figure out every edge case, 499 00:23:16,156 --> 00:23:17,916 Speaker 2: but you probably have to figure out a lot for 500 00:23:17,996 --> 00:23:21,516 Speaker 2: your thing to work in a functional sense, right? 501 00:23:22,356 --> 00:23:25,596 Speaker 1: Yeah. And so in the meantime, you're not just 502 00:23:25,596 --> 00:23:27,396 Speaker 1: digging and loading. You've got to reposition, you've got to 503 00:23:27,516 --> 00:23:29,516 Speaker 1: organize the earth, you've got to think about the sequencing, 504 00:23:29,516 --> 00:23:32,316 Speaker 1: you've got to deal with, you know, daytime, nighttime, rain. 505 00:23:32,796 --> 00:23:35,116 Speaker 1: And so you have, like, these types of, like, really 506 00:23:35,196 --> 00:23:37,996 Speaker 1: challenging diversity you have to think about and deal with. 507 00:23:38,116 --> 00:23:41,596 Speaker 1: So I think, all in all, it's still a complicated 508 00:23:42,556 --> 00:23:45,276 Speaker 1: product area where there's a huge amount of diversity in 509 00:23:45,356 --> 00:23:47,236 Speaker 1: the things that need to be done. But it's one 510 00:23:47,276 --> 00:23:50,076 Speaker 1: of those where, like, I personally think there's a handful 511 00:23:50,116 --> 00:23:52,716 Speaker 1: of these holy grails of autonomy in physical industries that 512 00:23:53,596 --> 00:23:59,836 Speaker 1: are, like, genuinely transformational opportunities for both positive impact 513 00:23:59,876 --> 00:24:01,876 Speaker 1: to the country and the world, and also just kind 514 00:24:01,876 --> 00:24:04,036 Speaker 1: of scale of industries that are like double digit percentages 515 00:24:04,036 --> 00:24:06,996 Speaker 1: of GDP. Transportation is, without a doubt, one of them. 516 00:24:07,236 --> 00:24:10,396 Speaker 1: Construction is one of them. Agriculture is not far behind.
517 00:24:10,476 --> 00:24:12,916 Speaker 1: You know, mining is very, very significant as well. 518 00:24:13,316 --> 00:24:16,116 Speaker 1: Manufacturing is one of them. And so I think that 519 00:24:16,156 --> 00:24:18,476 Speaker 1: we're going to see a wave over the next ten 520 00:24:18,556 --> 00:24:22,556 Speaker 1: years in autonomy, but it's going to be tackling this 521 00:24:22,636 --> 00:24:24,876 Speaker 1: like seventy five percent of the world's GDP that's physical 522 00:24:24,876 --> 00:24:26,956 Speaker 1: and not digital, and there's a lot of work, like 523 00:24:27,036 --> 00:24:29,796 Speaker 1: a lot of positive impact, that can happen across these spaces. 524 00:24:29,876 --> 00:24:31,516 Speaker 2: I mean, give me a little more on that one. 525 00:24:31,676 --> 00:24:33,996 Speaker 2: Pick a time in the future, five years, ten years, 526 00:24:34,356 --> 00:24:35,156 Speaker 2: not more than ten. 527 00:24:35,516 --> 00:24:38,116 Speaker 1: So my personal belief is that this idea... like, there's 528 00:24:38,116 --> 00:24:39,756 Speaker 1: a lot of companies getting funded for this, but the 529 00:24:39,796 --> 00:24:43,316 Speaker 1: idea that, like, this giant brain for all of robotics, 530 00:24:43,476 --> 00:24:47,316 Speaker 1: the foundation model for robotics... like, I personally do not 531 00:24:47,396 --> 00:24:50,116 Speaker 1: believe that that's viable in the next ten years, because 532 00:24:50,156 --> 00:24:55,836 Speaker 1: you have such complexity in really understanding these verticals on 533 00:24:56,596 --> 00:24:59,716 Speaker 1: the inputs, the hardware, the products, the end use case, 534 00:24:59,756 --> 00:25:02,716 Speaker 1: the customers, everything, that every single one of those has 535 00:25:02,836 --> 00:25:05,396 Speaker 1: such complexity to get data that the idea of getting 536 00:25:05,476 --> 00:25:09,836 Speaker 1: enough data that bulldozes through the generalization problem...
That's 537 00:25:09,836 --> 00:25:12,276 Speaker 1: a long ways off. But I do think that we're at 538 00:25:12,356 --> 00:25:14,516 Speaker 1: a perfect time for these vertical solutions, where if you 539 00:25:14,556 --> 00:25:17,396 Speaker 1: have a focused solution where you're trying to do construction, 540 00:25:17,476 --> 00:25:20,396 Speaker 1: you're trying to do fulfillment in a warehouse, you're trying 541 00:25:20,396 --> 00:25:23,356 Speaker 1: to do a very focused manufacturing solution, I think there's 542 00:25:23,396 --> 00:25:26,676 Speaker 1: actually a giant step function in how powerful and 543 00:25:26,716 --> 00:25:29,036 Speaker 1: viable the technologies are. But if you're trying to do a 544 00:25:29,076 --> 00:25:31,556 Speaker 1: humanoid to do everything in a home for a consumer 545 00:25:31,596 --> 00:25:33,276 Speaker 1: market, too far away. 546 00:25:33,836 --> 00:25:36,716 Speaker 2: Right, so you're saying basically you've got to solve 547 00:25:36,756 --> 00:25:39,636 Speaker 2: one problem at a time. Yeah. I want to go 548 00:25:39,716 --> 00:25:42,796 Speaker 2: back to open road autonomy. Almost done. When do you 549 00:25:42,836 --> 00:25:47,316 Speaker 2: think that most rides most people take in cars or 550 00:25:47,356 --> 00:25:51,716 Speaker 2: trucks will be in driverless cars, autonomous vehicles? 551 00:25:52,156 --> 00:25:55,956 Speaker 1: Great question. There's a few prerequisites for most rides. 552 00:25:55,956 --> 00:25:58,356 Speaker 1: So first, all the goofiness has to disappear and 553 00:25:58,356 --> 00:26:01,116 Speaker 1: the thing just works everywhere. That's on a trajectory of 554 00:26:01,116 --> 00:26:03,076 Speaker 1: getting there, because it's getting more and more efficient as 555 00:26:03,316 --> 00:26:05,716 Speaker 1: it scales across the country and then the world.
It 556 00:26:05,756 --> 00:26:08,156 Speaker 1: has to go from ride hailing to personal car ownership. 557 00:26:08,756 --> 00:26:13,236 Speaker 1: That is both cost down and versatility and everything. Like, 558 00:26:13,516 --> 00:26:14,516 Speaker 1: but that's gonna happen. 559 00:26:14,596 --> 00:26:18,356 Speaker 2: In that universe, when do they take the steering wheel out? 560 00:26:18,556 --> 00:26:20,236 Speaker 2: I just took a Waymo for the first time. I 561 00:26:20,236 --> 00:26:21,356 Speaker 2: was in San Francisco this summer. 562 00:26:21,356 --> 00:26:22,836 Speaker 1: It's awesome, right? It was awesome. 563 00:26:22,916 --> 00:26:25,916 Speaker 2: Yeah. And like the weirdest thing about it to me 564 00:26:26,116 --> 00:26:28,036 Speaker 2: was the steering wheel. Yeah, right? Like, if it was 565 00:26:28,116 --> 00:26:30,076 Speaker 2: just a box that I got in that looked like 566 00:26:30,076 --> 00:26:31,836 Speaker 2: a train car or something, I would have been like, oh sure, 567 00:26:31,836 --> 00:26:34,036 Speaker 2: it's like the air train or whatever. But that there's 568 00:26:34,316 --> 00:26:36,316 Speaker 2: a steering wheel, and it's like a ghost is turning 569 00:26:36,316 --> 00:26:38,516 Speaker 2: the steering wheel... like, what's the steering wheel doing there? 570 00:26:39,036 --> 00:26:42,076 Speaker 1: So the steering wheels will disappear pretty quickly, because ride 571 00:26:42,116 --> 00:26:44,956 Speaker 1: sharing autonomous cars, yeah, have no use for it, and, 572 00:26:44,996 --> 00:26:46,796 Speaker 1: you know, there's like special cases 573 00:26:46,836 --> 00:26:49,116 Speaker 1: where we need to recover them. But it's wasted space, right? 574 00:26:49,196 --> 00:26:53,276 Speaker 1: It's a wasted spot. So that'll happen soon. But the 575 00:26:53,356 --> 00:26:57,196 Speaker 1: idea of really deep penetration, I think, is personal cars.
576 00:26:57,716 --> 00:27:00,716 Speaker 1: It's beyond luxury, which is another generation, to like actually 577 00:27:00,796 --> 00:27:03,156 Speaker 1: make it be something that's affordable. Then you've got to 578 00:27:03,196 --> 00:27:05,476 Speaker 1: go through a buying cycle, which is like five to 579 00:27:05,556 --> 00:27:08,316 Speaker 1: seven years, and so I think it starts to get 580 00:27:08,356 --> 00:27:11,996 Speaker 1: serious penetration in the back half of the thirties, and 581 00:27:12,076 --> 00:27:16,956 Speaker 1: it'll be the forties where, like, okay, fifty percent of 582 00:27:17,316 --> 00:27:19,356 Speaker 1: driving is kind of like autonomous. I think it's still 583 00:27:19,356 --> 00:27:21,996 Speaker 1: twenty years off in my mind. For example, if you bought 584 00:27:21,996 --> 00:27:23,796 Speaker 1: a car today, it's going to be in circulation for 585 00:27:23,836 --> 00:27:26,836 Speaker 1: the next twelve to fifteen years. It just takes a while. 586 00:27:31,116 --> 00:27:46,156 Speaker 2: We'll be back in a minute with the lightning round. Okay, 587 00:27:46,236 --> 00:27:49,356 Speaker 2: let's finish with the lightning round. If you could operate 588 00:27:49,596 --> 00:27:51,316 Speaker 2: any machine, what would it be? 589 00:27:53,156 --> 00:27:56,116 Speaker 1: Oh gosh, this is a fun one. If I could 590 00:27:56,156 --> 00:27:58,476 Speaker 1: operate any machine at all in the world, not even 591 00:27:58,516 --> 00:28:02,196 Speaker 1: in construction, anything? Oh gosh. Okay, you know what, I 592 00:28:02,236 --> 00:28:05,396 Speaker 1: would love to either operate one of the, like, Boring 593 00:28:05,436 --> 00:28:09,436 Speaker 1: Company gigantic drill machines.
That's like so astronomical. Or there's 594 00:28:09,476 --> 00:28:12,156 Speaker 1: like mine trucks that are so astronomically big that they 595 00:28:12,236 --> 00:28:14,476 Speaker 1: just dwarf any machine in the world, and it costs 596 00:28:14,476 --> 00:28:16,916 Speaker 1: like five million dollars, and just the scale. That would 597 00:28:16,956 --> 00:28:18,836 Speaker 1: be really, really fun to try at some point. 598 00:28:19,116 --> 00:28:22,636 Speaker 2: Just to be that high, just have that much momentum, right? 599 00:28:22,716 --> 00:28:24,916 Speaker 2: That much mass at your disposal. 600 00:28:25,636 --> 00:28:28,236 Speaker 1: Yeah, like literally the tire is like three stories tall. 601 00:28:28,276 --> 00:28:29,956 Speaker 1: It's like, it's wild. Like, it's absurd. 602 00:28:30,316 --> 00:28:33,236 Speaker 2: What's one thing you remember about immigrating from the Soviet 603 00:28:33,316 --> 00:28:35,636 Speaker 2: Union to the US when you were six? 604 00:28:36,236 --> 00:28:39,236 Speaker 1: I was six. Yeah, I was born in Moscow. We immigrated. 605 00:28:39,276 --> 00:28:43,196 Speaker 1: I was super young. I remember we ended up having 606 00:28:43,196 --> 00:28:46,236 Speaker 1: a pit stop in Europe, where there's a standard path 607 00:28:46,276 --> 00:28:47,996 Speaker 1: of going to Vienna and Venice and Rome while 608 00:28:47,996 --> 00:28:50,236 Speaker 1: your paperwork gets processed, and then we went to New York.
609 00:28:50,596 --> 00:28:53,556 Speaker 1: I remember running around the rooftops of Venice with a 610 00:28:53,596 --> 00:28:57,476 Speaker 1: friend of mine, causing a bunch of, you know, trouble 611 00:28:57,516 --> 00:28:59,956 Speaker 1: and running away and disappearing for long periods of time 612 00:29:00,156 --> 00:29:02,476 Speaker 1: and having a blast. For whatever reason, running around 613 00:29:02,516 --> 00:29:04,516 Speaker 1: the rooftops of Venice was ingrained in 614 00:29:04,556 --> 00:29:07,156 Speaker 1: my memory, which was a kind of very positive memory 615 00:29:07,196 --> 00:29:09,116 Speaker 1: while my parents were going through a whole bunch of stress. 616 00:29:09,116 --> 00:29:12,356 Speaker 2: It sounds very free. Not to project East versus West 617 00:29:12,436 --> 00:29:14,876 Speaker 2: language onto it, but it sounds very free. 618 00:29:15,396 --> 00:29:18,356 Speaker 1: It was very freeing. It's funny. I just hit 619 00:29:18,396 --> 00:29:21,556 Speaker 1: the age my dad was when they immigrated from the 620 00:29:21,636 --> 00:29:25,116 Speaker 1: Soviet Union with two kids. My sister was one month old. 621 00:29:25,596 --> 00:29:29,836 Speaker 1: Zero money, almost no English, having a fresh start. 622 00:29:29,756 --> 00:29:32,436 Speaker 2: So courageous, right? Doesn't it seem so brave? 623 00:29:33,436 --> 00:29:35,396 Speaker 1: Yeah, it does. And they were trying to leave for 624 00:29:35,396 --> 00:29:38,196 Speaker 1: like ten years and couldn't leave. Yeah, it's kind of fascinating. 625 00:29:38,236 --> 00:29:40,756 Speaker 1: So we're very fortunate to not have to go through 626 00:29:40,836 --> 00:29:41,436 Speaker 1: something like that. 627 00:29:41,716 --> 00:29:43,316 Speaker 2: When you think about it that way, does 628 00:29:43,316 --> 00:29:45,516 Speaker 2: that, like, put pressure on you?
You think, oh my god, 629 00:29:45,556 --> 00:29:49,116 Speaker 2: my parents did all this? I better deliver. 630 00:29:50,236 --> 00:29:52,636 Speaker 1: It's funny. Uh, a little bit. I mean, you kind 631 00:29:52,636 --> 00:29:56,796 Speaker 1: of have this, I don't know, like adventurous spirit, I 632 00:29:56,796 --> 00:29:59,636 Speaker 1: guess. Maybe it gets ingrained. I think of it now 633 00:29:59,636 --> 00:30:01,516 Speaker 1: for my kids, where I'm like, okay, like, now they're 634 00:30:01,756 --> 00:30:05,236 Speaker 1: in a really nice and comfortable environment growing up. How 635 00:30:05,236 --> 00:30:07,036 Speaker 1: do I convey that edge and a little bit 636 00:30:07,076 --> 00:30:08,196 Speaker 1: of that spirit? Like, how... 637 00:30:08,076 --> 00:30:09,316 Speaker 2: do you keep them from going soft? 638 00:30:09,996 --> 00:30:13,036 Speaker 1: Yeah. It's not that you want them to go through anything difficult, 639 00:30:13,076 --> 00:30:14,716 Speaker 1: because it's not that I was... you know, for me, 640 00:30:14,756 --> 00:30:17,756 Speaker 1: it was actually just an adventure. For my parents, it was difficult. 641 00:30:17,756 --> 00:30:20,916 Speaker 1: But part of it is just, yeah, like conveying that 642 00:30:21,236 --> 00:30:24,076 Speaker 1: spirit of being able to, like, be comfortable trying to 643 00:30:24,156 --> 00:30:27,516 Speaker 1: tackle something new and being thrown into a completely different environment. 644 00:30:27,836 --> 00:30:29,756 Speaker 1: It's hard to force or simulate that when you're 645 00:30:29,796 --> 00:30:31,796 Speaker 1: just growing up in the San Francisco area. Right? 646 00:30:32,036 --> 00:30:34,196 Speaker 2: I appreciate your time. Thanks for talking to me. 647 00:30:34,476 --> 00:30:35,956 Speaker 1: It's a pleasure. This was a lot of fun. Thanks 648 00:30:35,996 --> 00:30:43,196 Speaker 1: for having me.
649 00:30:43,396 --> 00:30:47,916 Speaker 2: Boris Soffman is the co founder and CEO of Bedrock Robotics. 650 00:30:48,596 --> 00:30:51,396 Speaker 2: Just a quick note that this is our last episode 651 00:30:51,516 --> 00:30:53,796 Speaker 2: before a break of a couple of weeks, and then 652 00:30:53,836 --> 00:30:56,796 Speaker 2: we'll be back with more episodes. Please email us at 653 00:30:56,956 --> 00:31:00,116 Speaker 2: problem at Pushkin dot FM. We are always looking for 654 00:31:00,236 --> 00:31:04,236 Speaker 2: new guests for the show. Today's show was produced by 655 00:31:04,276 --> 00:31:08,156 Speaker 2: Trinamanino and Gabriel Hunter Chang. It was edited by Alexander 656 00:31:08,156 --> 00:31:11,996 Speaker 2: Garretton and engineered by Sarah Brugueri. I'm Jacob Goldstein, and 657 00:31:12,076 --> 00:31:14,236 Speaker 2: we'll be back next week with another episode of What's 658 00:31:14,276 --> 00:31:14,676 Speaker 2: Your Problem.