1 00:00:15,356 --> 00:00:22,036 Speaker 1: Pushkin. In two thousand and three, Chris Urmson was a 2 00:00:22,076 --> 00:00:25,836 Speaker 1: grad student at Carnegie Mellon University and he was working 3 00:00:25,836 --> 00:00:29,476 Speaker 1: on a research project in Chile in the Atacama Desert. 4 00:00:29,836 --> 00:00:33,196 Speaker 2: Beautiful place, very desolate, the closest thing we have to 5 00:00:33,236 --> 00:00:36,156 Speaker 2: Mars in many ways on Earth, which is why you 6 00:00:36,196 --> 00:00:38,996 Speaker 2: were there. We were building a robot to go look 7 00:00:39,036 --> 00:00:41,556 Speaker 2: for signs of life, and I was developing the navigation 8 00:00:41,636 --> 00:00:45,076 Speaker 2: system for it, and my PhD advisor came down and said, hey, 9 00:00:45,116 --> 00:00:47,436 Speaker 2: the Defense Department has this competition. They want to build 10 00:00:47,436 --> 00:00:50,076 Speaker 2: a robot to drive fifty miles an hour across the 11 00:00:50,116 --> 00:00:52,596 Speaker 2: desert between Los Angeles and Las Vegas. And I thought that 12 00:00:52,676 --> 00:00:56,956 Speaker 2: just sounded awesome. So we bought a Humvee and we 13 00:00:57,036 --> 00:00:59,196 Speaker 2: cut the top off it, put a big electronics box 14 00:00:59,236 --> 00:01:01,756 Speaker 2: in, and a lot of computers. All of the stuff that's 15 00:01:01,796 --> 00:01:04,756 Speaker 2: in a self driving car today. The high-performance computing, 16 00:01:05,356 --> 00:01:09,396 Speaker 2: the lasers, the radars, the cameras, the HD maps 17 00:01:09,436 --> 00:01:11,716 Speaker 2: were part of this. It's kind of the first time 18 00:01:11,716 --> 00:01:15,676 Speaker 2: when all this stuff came together into a vehicle, and 19 00:01:15,716 --> 00:01:18,876 Speaker 2: then we developed it for six or nine months, and 20 00:01:18,876 --> 00:01:21,596 Speaker 2: then we tried to compete with it. The first DARPA Challenge. 21 00:01:22,516 --> 00:01:25,716 Speaker 2: The team I was a technical director for, we actually 22 00:01:25,716 --> 00:01:28,876 Speaker 2: had the highest performing thing there. It was supposed to 23 00:01:28,916 --> 00:01:30,876 Speaker 2: go one hundred and twenty six miles across the desert. 24 00:01:31,036 --> 00:01:35,236 Speaker 2: It ended up going about seven, when it got hung 25 00:01:35,316 --> 00:01:38,196 Speaker 2: up on a berm and almost literally burst into flames. 26 00:01:38,876 --> 00:01:41,596 Speaker 1: And you, I mean, nobody won. But if anybody won, 27 00:01:41,676 --> 00:01:43,956 Speaker 1: you won, right? That was more than nobody. That was 28 00:01:43,996 --> 00:01:46,156 Speaker 1: farther than... Yeah. The key thing is nobody won. 29 00:01:46,396 --> 00:01:48,796 Speaker 2: If you imagine a marathon where you're supposed to go 30 00:01:48,836 --> 00:01:50,996 Speaker 2: twenty six miles and the best runner went two, right, 31 00:01:51,036 --> 00:01:52,516 Speaker 2: it's... yes. 32 00:01:52,436 --> 00:01:55,956 Speaker 1: Yes, fair. And then burst into flames. And then burst 33 00:01:55,996 --> 00:01:56,516 Speaker 1: into flames. 34 00:01:56,556 --> 00:02:02,996 Speaker 2: Yeah. 35 00:02:03,036 --> 00:02:05,276 Speaker 1: I'm Jacob Goldstein, and this is What's Your Problem?, the 36 00:02:05,316 --> 00:02:07,076 Speaker 1: show where I talk to people who are trying to 37 00:02:07,116 --> 00:02:11,556 Speaker 1: make technological progress. My guest today is Chris Urmson.
He's 38 00:02:11,596 --> 00:02:14,636 Speaker 1: the co-founder and CEO of Aurora, a company that's 39 00:02:14,636 --> 00:02:18,396 Speaker 1: trying to bring autonomous driving to commercial trucks. As you 40 00:02:18,476 --> 00:02:21,636 Speaker 1: just heard, Chris started working on autonomous vehicles more than 41 00:02:21,716 --> 00:02:25,756 Speaker 1: twenty years ago, and in fact, his career traces really 42 00:02:25,956 --> 00:02:29,596 Speaker 1: the entire modern arc of self driving cars, from 43 00:02:29,596 --> 00:03:33,196? Speaker 1: that first DARPA Challenge to a top job at Google's 44 00:02:33,196 --> 00:02:36,276 Speaker 1: self driving car project, the project that later became Waymo, 45 00:02:36,756 --> 00:02:40,236 Speaker 1: to founding Aurora, the company where he works now. We 46 00:02:40,276 --> 00:02:42,556 Speaker 1: talked a lot about the problems Chris is trying to 47 00:02:42,596 --> 00:02:45,956 Speaker 1: solve in autonomous trucking, and we also talked about the 48 00:02:45,996 --> 00:02:49,556 Speaker 1: future of autonomous driving more generally. But before we got 49 00:02:49,596 --> 00:02:52,036 Speaker 1: to that, we talked about the earlier work he did, 50 00:02:52,396 --> 00:02:55,196 Speaker 1: work that really helped get the field, the broader field, 51 00:02:55,276 --> 00:02:57,756 Speaker 1: to where it is today. So we'll pick up the 52 00:02:57,796 --> 00:03:01,316 Speaker 1: interview back in two thousand and four, at that moment 53 00:03:01,396 --> 00:03:05,516 Speaker 1: when his team's autonomous car had failed and almost burst 54 00:03:05,516 --> 00:03:06,116 Speaker 1: into flames. 55 00:03:06,476 --> 00:03:09,596 Speaker 2: It was crushing, yeah, right. Like, you know, on the 56 00:03:09,636 --> 00:03:12,716 Speaker 2: one hand, it was called a Grand Challenge, right, because, 57 00:03:13,076 --> 00:03:14,396 Speaker 2: you know, a lot of people didn't think it could 58 00:03:14,436 --> 00:03:17,356 Speaker 2: be solved, that it was impossible. On the one hand, 59 00:03:17,356 --> 00:03:20,516 Speaker 2: it's kind of like, okay, it was really hard. On 60 00:03:20,556 --> 00:03:23,396 Speaker 2: the other hand, you know, we're sleeping on the 61 00:03:23,436 --> 00:03:25,356 Speaker 2: floor in the lab, in the, you know, in this 62 00:03:25,396 --> 00:03:28,716 Speaker 2: garage in the middle of the desert for weeks and 63 00:03:28,756 --> 00:03:30,356 Speaker 2: months trying to get this thing to go, and then 64 00:03:30,356 --> 00:03:32,756 Speaker 2: you launch it out there and you have this, this 65 00:03:32,836 --> 00:03:35,756 Speaker 2: epic moment where you're watching this thing that you've built 66 00:03:35,836 --> 00:03:37,716 Speaker 2: tear out across the desert, and you can just see 67 00:03:37,716 --> 00:03:41,596 Speaker 2: the top of it, and these helicopters that the military 68 00:03:41,676 --> 00:03:43,436 Speaker 2: is using to track them as they're out in the 69 00:03:43,476 --> 00:03:47,196 Speaker 2: desert doing this, and then you hear, you know, it's 70 00:03:47,196 --> 00:03:48,916 Speaker 2: supposed to go one hundred and twenty miles, and they're like, oh, 71 00:03:48,916 --> 00:03:52,516 Speaker 2: it's stuck. And then you see, like, billowing smoke, and 72 00:03:52,556 --> 00:03:55,756 Speaker 2: it's broken, and the poor thing comes back looking very sad, and, 73 00:03:56,236 --> 00:03:59,076 Speaker 2: you know, it was crushing. But at the same time, 74 00:04:00,276 --> 00:04:02,756 Speaker 2: we did set records.
Right, if you look at the 75 00:04:02,836 --> 00:04:04,876 Speaker 2: kind of the product of the speed and the distance 76 00:04:04,916 --> 00:04:07,036 Speaker 2: it went, this was an order of magnitude better than 77 00:04:07,076 --> 00:04:10,276 Speaker 2: anyone had done before, and so it was a big step. 78 00:04:10,356 --> 00:04:12,636 Speaker 2: And I think that was partly what encouraged the Defense 79 00:04:12,636 --> 00:04:15,156 Speaker 2: Department to say, hey, come back again and try it 80 00:04:15,156 --> 00:04:15,556 Speaker 2: in a year. 81 00:04:17,196 --> 00:04:19,716 Speaker 1: And then you did, and then you did it again 82 00:04:19,836 --> 00:04:21,476 Speaker 1: and again. And what year do we go to? Is 83 00:04:21,516 --> 00:04:23,996 Speaker 1: it two thousand and seven? Is that the next 84 00:04:23,996 --> 00:04:24,596 Speaker 1: big moment? 85 00:04:24,836 --> 00:04:26,796 Speaker 2: Yeah. Two thousand and five was another one of these 86 00:04:26,796 --> 00:04:30,116 Speaker 2: in the desert. Two thousand and seven they did what 87 00:04:30,116 --> 00:04:32,676 Speaker 2: they called the Urban Challenge. So two thousand and five, 88 00:04:32,676 --> 00:04:34,836 Speaker 2: we were able to drive around the desert and not hit 89 00:04:34,876 --> 00:04:37,636 Speaker 2: things too much. Two thousand and seven, they say, okay, 90 00:04:37,676 --> 00:04:39,556 Speaker 2: that's great, but now let's stay on the right side 91 00:04:39,556 --> 00:04:41,756 Speaker 2: of the road, let's move in kind of pseudo urban 92 00:04:41,876 --> 00:04:46,516 Speaker 2: environments, and show that that can work. And that was 93 00:04:46,556 --> 00:04:48,956 Speaker 2: the last of the Grand Challenges, because that year we actually 94 00:04:49,116 --> 00:04:51,676 Speaker 2: accomplished it. We drove sixty miles around this air base 95 00:04:51,756 --> 00:04:54,276 Speaker 2: and interacted with traffic. They hired a bunch of stunt 96 00:04:54,316 --> 00:04:56,236 Speaker 2: drivers out of Hollywood to come and drive cars around 97 00:04:56,276 --> 00:05:00,356 Speaker 2: these robots to make traffic for them, and it was 98 00:05:00,356 --> 00:05:03,316 Speaker 2: a heck of a day. And then the Defense Department, 99 00:05:03,396 --> 00:05:06,756 Speaker 2: or at least DARPA, said, you know, effectively, problem solved. 100 00:05:07,836 --> 00:05:09,676 Speaker 2: We've got this close enough. Now hand it off to 101 00:05:09,756 --> 00:05:11,636 Speaker 2: industry or whoever to take it from here. 102 00:05:12,516 --> 00:05:14,356 Speaker 1: And how did, how did your car do? 103 00:05:14,436 --> 00:05:17,116 Speaker 2: In two thousand and seven, we won, which was nice. 104 00:05:17,156 --> 00:05:17,916 Speaker 2: It was a good change. 105 00:05:18,436 --> 00:05:21,796 Speaker 1: So, you know, you won. And did it catch on fire? 106 00:05:22,116 --> 00:05:24,996 Speaker 2: No fire this time. It just kind of worked. We 107 00:05:25,076 --> 00:05:29,436 Speaker 2: had a little bit of nerves at the very beginning, 108 00:05:29,516 --> 00:05:32,636 Speaker 2: because, you know, our system was pretty reliable, robust, it 109 00:05:32,716 --> 00:05:35,476 Speaker 2: worked all the time. We go and bring it out 110 00:05:35,516 --> 00:05:38,116 Speaker 2: to the start line and we can't get it to 111 00:05:38,196 --> 00:05:42,236 Speaker 2: boot properly, because it can't see the GPS satellites that 112 00:05:42,236 --> 00:05:46,276 Speaker 2: it needed, and we'd never had this problem before.
We're like, 113 00:05:46,756 --> 00:05:48,756 Speaker 2: you know, frantically trying to figure out what's going on. 114 00:05:48,796 --> 00:05:52,196 Speaker 2: And then we look over and realize that for race 115 00:05:52,276 --> 00:05:56,716 Speaker 2: day they brought a JumboTron, and they've set that up 116 00:05:56,836 --> 00:05:59,196 Speaker 2: right where the robots are at the starting place. And 117 00:05:59,276 --> 00:06:01,996 Speaker 2: it turns out they don't really check these mobile JumboTrons 118 00:06:02,036 --> 00:06:05,556 Speaker 2: for, like, how much interference they're generating, I guess. So 119 00:06:05,596 --> 00:06:07,356 Speaker 2: we had this panic moment. We said, you know, can 120 00:06:07,396 --> 00:06:09,556 Speaker 2: you turn off the JumboTron? And sure enough, they 121 00:06:09,556 --> 00:06:13,756 Speaker 2: turn that off and suddenly everything works again. So, you know, 122 00:06:13,956 --> 00:06:18,156 Speaker 2: it was, it wasn't without its challenges. 123 00:06:18,156 --> 00:06:19,916 Speaker 1: But it works, right? So it works, and it works in a, 124 00:06:19,996 --> 00:06:24,476 Speaker 1: like, quasi real environment, a pseudo real environment. So, I 125 00:06:24,516 --> 00:06:26,676 Speaker 1: mean, at that moment, when you were thinking 126 00:06:26,716 --> 00:06:31,236 Speaker 1: about the future, the future of self driving cars, how 127 00:06:31,836 --> 00:06:33,116 Speaker 1: did it look to you? What did you think? 128 00:06:33,996 --> 00:06:37,476 Speaker 2: Yeah, I think at that point I was starting to believe 129 00:06:37,476 --> 00:06:39,796 Speaker 2: there was a thing that could happen here, right, and 130 00:06:40,036 --> 00:06:44,316 Speaker 2: probably, you know, certainly naively thought, oh yeah, we've got 131 00:06:44,316 --> 00:06:44,876 Speaker 2: this, right. 132 00:06:46,076 --> 00:06:49,716 Speaker 1: I mean, maybe everybody kind of thought that, right? Like... 133 00:06:50,036 --> 00:06:53,676 Speaker 2: Yeah, I think. And that's been part of what's made 134 00:06:53,676 --> 00:06:56,396 Speaker 2: the journey really fun for me. It's just, there's just 135 00:06:56,476 --> 00:06:59,516 Speaker 2: so many things to learn, so many new problems to 136 00:06:59,516 --> 00:07:03,036 Speaker 2: overcome along the way of taking something from, it 137 00:07:03,196 --> 00:07:05,996 Speaker 2: worked on this day in a race, to, hey, we 138 00:07:06,036 --> 00:07:10,156 Speaker 2: can go do something transformational that will save lives and 139 00:07:10,196 --> 00:07:13,156 Speaker 2: improve the economy and, you know, do all these other 140 00:07:13,196 --> 00:07:13,756 Speaker 2: great things. 141 00:07:14,876 --> 00:07:17,996 Speaker 1: It seems like the distance between it worked in 142 00:07:18,036 --> 00:07:22,276 Speaker 1: a race and we can do something transformational has been longer, 143 00:07:22,636 --> 00:07:24,676 Speaker 1: certainly, than a lot of people thought. Than a lot 144 00:07:24,676 --> 00:07:26,676 Speaker 1: of, like, smart, well informed people... 145 00:07:26,436 --> 00:07:32,276 Speaker 2: Thought. Yeah, me among them, right.
And, you know, it's 146 00:07:32,276 --> 00:07:34,116 Speaker 2: one of these things where I think anyone who sets 147 00:07:34,156 --> 00:07:38,516 Speaker 2: out and does something interesting and exciting, often, I think, 148 00:07:38,556 --> 00:07:40,196 Speaker 2: if they understood how hard it was going to be 149 00:07:40,196 --> 00:07:43,516 Speaker 2: before they started, they may not have actually taken the first step. 150 00:07:43,556 --> 00:07:47,156 Speaker 2: And so I feel a little lucky to have not 151 00:07:47,236 --> 00:07:49,636 Speaker 2: understood that, because it's caused me to kind of stick 152 00:07:49,676 --> 00:07:51,956 Speaker 2: with it and really enjoy the journey along the way. 153 00:07:52,436 --> 00:07:55,316 Speaker 1: I mean, entrepreneurs basically always say that thing that you 154 00:07:55,476 --> 00:07:58,156 Speaker 1: just said, but in the case of autonomous driving, I 155 00:07:58,196 --> 00:08:01,756 Speaker 1: really believe it. Right? It seems like extra true. Like, 156 00:08:02,076 --> 00:08:04,876 Speaker 1: two thousand and seven was basically, like, the year the 157 00:08:04,996 --> 00:08:07,636 Speaker 1: iPhone came out, right? Which is to say, in 158 00:08:08,076 --> 00:08:11,116 Speaker 1: technology, a long time ago. Not to mention that there was, 159 00:08:11,196 --> 00:08:14,636 Speaker 1: like, no computer vision as we know it today, right, 160 00:08:14,676 --> 00:08:17,556 Speaker 1: which has become important, I mean, so much. 161 00:08:18,196 --> 00:08:23,396 Speaker 2: No, it is. You know, the waves of technological evolution 162 00:08:23,436 --> 00:08:26,356 Speaker 2: that we've seen since then are profound, and we've been 163 00:08:26,396 --> 00:08:28,436 Speaker 2: able to kind of surf those as we've been building it. 164 00:08:28,476 --> 00:08:35,716 Speaker 2: And it's been... Yeah, it is. It's been a long time. 165 00:08:39,156 --> 00:08:41,036 Speaker 1: So then you go to Google, right? You go to 166 00:08:41,036 --> 00:08:43,196 Speaker 1: Google a couple of years after that, two thousand 167 00:08:43,236 --> 00:08:46,756 Speaker 1: and nine. So you're at Google from two thousand and nine to 168 00:08:46,836 --> 00:08:50,276 Speaker 1: twenty sixteen, right? What do you and the team figure out? 169 00:08:50,356 --> 00:08:52,356 Speaker 1: And then, yeah, when do you leave? 170 00:08:52,916 --> 00:08:58,676 Speaker 2: So we get there, and we originally want to show 171 00:08:58,756 --> 00:09:00,156 Speaker 2: that we could go from what we'd shown at the 172 00:09:00,236 --> 00:09:04,196 Speaker 2: DARPA Challenge to something that actually worked on public roads, and 173 00:09:04,276 --> 00:09:07,476 Speaker 2: so we had this challenge of driving one hundred thousand 174 00:09:07,516 --> 00:09:10,836 Speaker 2: miles on public roads, and a thousand miles of really interesting, challenging 175 00:09:10,876 --> 00:09:13,276 Speaker 2: roads in the Bay Area, things like driving down Highway 176 00:09:13,316 --> 00:09:16,596 Speaker 2: One from San Francisco to LA, and across all the 177 00:09:16,636 --> 00:09:19,396 Speaker 2: Bay bridges, and things like that, with the idea that 178 00:09:19,436 --> 00:09:22,356 Speaker 2: if we could prove we could do these things, then, yeah, 179 00:09:22,396 --> 00:09:25,716 Speaker 2: this technology probably has legs.
And it was kind of 180 00:09:25,756 --> 00:09:29,596 Speaker 2: a Dread Pirate Roberts setup, right, in that, you know, 181 00:09:29,916 --> 00:09:33,116 Speaker 2: we were told that you've got, you know, two years 182 00:09:33,156 --> 00:09:35,836 Speaker 2: to do this, and, you know, then we'll probably fire 183 00:09:35,876 --> 00:09:38,716 Speaker 2: you the day after, right? And this 184 00:09:38,876 --> 00:09:41,156 Speaker 2: was kind of with us the whole time. And so 185 00:09:41,276 --> 00:09:44,076 Speaker 2: we worked with urgency, and it turns out we did 186 00:09:44,116 --> 00:09:46,836 Speaker 2: it in eighteen months, and through that we had some really 187 00:09:46,876 --> 00:09:51,236 Speaker 2: interesting technological evolution. You know, we started to see some 188 00:09:51,276 --> 00:09:55,036 Speaker 2: of the, you know, early legs of deep learning start 189 00:09:55,036 --> 00:09:56,196 Speaker 2: to show up at that point. 190 00:09:56,996 --> 00:10:00,556 Speaker 1: During that time. So this is, like, two thousand and ten, 191 00:10:00,676 --> 00:10:02,396 Speaker 1: twenty eleven? Yeah. 192 00:10:02,156 --> 00:10:03,596 Speaker 2: Exactly those times, yep. 193 00:10:03,836 --> 00:10:04,036 Speaker 1: Yeah. 194 00:10:04,956 --> 00:10:10,156 Speaker 2: But I think, almost more importantly, we 195 00:10:10,236 --> 00:10:12,636 Speaker 2: really started to understand what this would mean to people. 196 00:10:13,396 --> 00:10:16,156 Speaker 2: Right? You know, as somebody who had worked 197 00:10:16,156 --> 00:10:19,516 Speaker 2: on this initially because the technology was cool, and 198 00:10:19,676 --> 00:10:22,436 Speaker 2: enjoyed the challenge of it, and seeing the benefit to 199 00:10:22,476 --> 00:10:24,956 Speaker 2: the military... as we 200 00:10:24,996 --> 00:10:26,556 Speaker 2: started to get into the space and started to talk 201 00:10:26,556 --> 00:10:30,316 Speaker 2: to people in the public sector about the number of 202 00:10:30,316 --> 00:10:33,316 Speaker 2: lives lost on America's roads, you know, forty plus thousand 203 00:10:33,316 --> 00:10:40,476 Speaker 2: people every year. The challenge for people who, whether because they 204 00:10:40,716 --> 00:10:44,276 Speaker 2: have a physical limitation that prevents them from driving or because 205 00:10:44,316 --> 00:10:46,956 Speaker 2: they've aged out, no longer feel comfortable driving, the implications 206 00:10:46,956 --> 00:10:50,396 Speaker 2: that has for their quality of life, and how we 207 00:10:50,476 --> 00:10:54,076 Speaker 2: can make that better for folks. You know, the accessibility 208 00:10:54,116 --> 00:10:57,956 Speaker 2: that we could bring to transportation that's kind of lost 209 00:10:57,996 --> 00:10:59,876 Speaker 2: with the public transit system we have in the US. 210 00:11:00,236 --> 00:11:02,596 Speaker 2: All of this was like, oh wow, it's not 211 00:11:02,716 --> 00:11:05,876 Speaker 2: just a cool, interesting technology. It's not 212 00:11:05,916 --> 00:11:07,716 Speaker 2: that I get to work with good people. It's not 213 00:11:07,796 --> 00:11:10,436 Speaker 2: just that, you know, this could be valuable. It's that, like, 214 00:11:10,476 --> 00:11:14,116 Speaker 2: this is socially really important to do as well. And then,
And then, 215 00:11:14,436 --> 00:11:18,196 Speaker 2: you know, the course of the program, we got to 216 00:11:18,236 --> 00:11:20,396 Speaker 2: the point where we launched for the first time a 217 00:11:20,476 --> 00:11:24,276 Speaker 2: vehicle operating on public roads with nobody in it, well, 218 00:11:24,316 --> 00:11:27,236 Speaker 2: actually with somebody at it, with Steve Mahon, who is 219 00:11:27,276 --> 00:11:29,316 Speaker 2: a blind gentleman who'd worked with us on some of 220 00:11:29,316 --> 00:11:31,796 Speaker 2: the early kind of concepts and understanding of this, and 221 00:11:31,836 --> 00:11:34,876 Speaker 2: he took a road to Austin, Texas by himself and 222 00:11:35,236 --> 00:11:41,436 Speaker 2: seeing the emotional impact that had on him of you know, 223 00:11:41,596 --> 00:11:44,796 Speaker 2: once again having the freedom to get from me to 224 00:11:44,836 --> 00:11:47,036 Speaker 2: be without having asked somebody else. It was just like, wow, 225 00:11:47,116 --> 00:11:51,236 Speaker 2: this this actually matters. So that was very cool. 226 00:11:52,676 --> 00:11:56,436 Speaker 1: So so you mentioned the machine learning piece starting to 227 00:11:56,476 --> 00:11:59,116 Speaker 1: come in. It's I think I was looking back at 228 00:11:59,196 --> 00:12:02,196 Speaker 1: I think it's twenty twelve that alex net comes along, right, 229 00:12:02,236 --> 00:12:03,956 Speaker 1: the first thing. I think that's right, kind of big 230 00:12:04,196 --> 00:12:08,396 Speaker 1: computer vision breakthrough. And so it is interesting. I mean 231 00:12:08,436 --> 00:12:10,956 Speaker 1: when I I think as a sort of lay person 232 00:12:11,156 --> 00:12:13,876 Speaker 1: about autonomous cars, now, I think of computer vision as 233 00:12:13,916 --> 00:12:17,236 Speaker 1: like central like, how could you even do it without 234 00:12:17,276 --> 00:12:20,676 Speaker 1: that right, without computers being able to quote unquote understand 235 00:12:20,676 --> 00:12:23,196 Speaker 1: what they see. But this is actually emerging just as 236 00:12:23,196 --> 00:12:26,956 Speaker 1: you're building this car, right, this system, So how like 237 00:12:27,716 --> 00:12:29,676 Speaker 1: tell me about that? Tell me sort of building in 238 00:12:29,756 --> 00:12:33,476 Speaker 1: parallel with this technological enabling wave. 239 00:12:33,996 --> 00:12:36,716 Speaker 2: Yeah, there was a bunch of these things. So Moore's 240 00:12:36,796 --> 00:12:39,276 Speaker 2: law was continued in a march along and the amount 241 00:12:39,276 --> 00:12:42,756 Speaker 2: of computation we had in the vehicles improving the cell 242 00:12:42,756 --> 00:12:46,156 Speaker 2: phone was, believe it or not, was a big boon 243 00:12:46,236 --> 00:12:50,036 Speaker 2: to us because it drove the performance of cameras. 244 00:12:50,596 --> 00:12:54,436 Speaker 1: So, oh, interesting thing about like all those selfies everybody 245 00:12:54,476 --> 00:12:57,636 Speaker 1: just wanting to make better cheaper cameras and the processing 246 00:12:57,716 --> 00:12:59,036 Speaker 1: for images, yeah. 
247 00:12:58,876 --> 00:13:01,956 Speaker 2: One hundred percent. But even just the physical imager itself, right, 248 00:13:02,036 --> 00:13:05,316 Speaker 2: went from, you know, a one megapixel, three twenty by two 249 00:13:05,396 --> 00:13:07,516 Speaker 2: forty camera, which is what you got if you bought a top 250 00:13:07,596 --> 00:13:11,236 Speaker 2: of the line, you know, Kodak digital camera back in the day, 251 00:13:11,636 --> 00:13:17,236 Speaker 2: to now my iPhone's got forty eight megapixels, right? And so, 252 00:13:17,516 --> 00:13:22,956 Speaker 2: and the quality and color performance and everything, that's improved dramatically. 253 00:13:23,956 --> 00:13:27,276 Speaker 2: And then lidar is another thing that has really 254 00:13:27,836 --> 00:13:31,196 Speaker 2: evolved over the course of the last twenty years, going 255 00:13:31,196 --> 00:13:34,076 Speaker 2: from things where we could get a single point along 256 00:13:34,116 --> 00:13:38,956 Speaker 2: a line to now full three D fields of view, 257 00:13:39,276 --> 00:13:43,436 Speaker 2: and that allows us to combine the camera data with the 258 00:13:43,476 --> 00:13:46,916 Speaker 2: lidar data with the computation. And then the final 259 00:13:46,996 --> 00:13:49,676 Speaker 2: big thing that's moved forward, on the sensing side, 260 00:13:49,756 --> 00:13:54,236 Speaker 2: is radar, automotive radar. The quality that you 261 00:13:54,236 --> 00:13:56,396 Speaker 2: can get out of that, the price point it comes 262 00:13:56,396 --> 00:13:59,516 Speaker 2: in at now. You know, when you combine that set 263 00:13:59,556 --> 00:14:03,276 Speaker 2: of advances, there's orders of magnitude improvement in both the 264 00:14:03,396 --> 00:14:06,156 Speaker 2: quality of data we can consume and the amount of 265 00:14:06,236 --> 00:14:09,756 Speaker 2: data that we can perform computation over, and that's really 266 00:14:09,876 --> 00:14:12,596 Speaker 2: unlocked our ability to understand the world around these vehicles. 267 00:14:14,156 --> 00:14:21,156 Speaker 1: So you leave in twenty sixteen. Like, are people getting 268 00:14:21,236 --> 00:14:21,996 Speaker 1: rides in, wherever 269 00:14:21,996 --> 00:14:22,276 Speaker 2: it was, 270 00:14:22,396 --> 00:14:25,196 Speaker 1: Chandler, Arizona, yet at that time? Like, is 271 00:14:25,196 --> 00:14:26,516 Speaker 1: it out in the world yet when you leave? 272 00:14:26,556 --> 00:14:29,956 Speaker 2: No. We'd been doing demos and we were doing development. 273 00:14:29,996 --> 00:14:32,036 Speaker 2: I think we had vehicles in Chandler at that time, 274 00:14:32,116 --> 00:14:35,796 Speaker 2: and we were starting to operate, but they were still with 275 00:14:35,916 --> 00:14:38,076 Speaker 2: a person behind the steering wheel, and we were still, 276 00:14:38,436 --> 00:14:43,236 Speaker 2: we were building towards what ultimately became the initial Chandler launch. 277 00:14:44,436 --> 00:14:45,796 Speaker 1: Why'd you leave? 278 00:14:46,116 --> 00:14:49,436 Speaker 2: It was time. You know, I had been there, I 279 00:14:49,476 --> 00:14:54,076 Speaker 2: really valued and loved the experience, but at some point 280 00:14:54,156 --> 00:14:56,396 Speaker 2: I kind of lost confidence that we were going to 281 00:14:56,396 --> 00:14:59,116 Speaker 2: make the decisions that we needed to make to be successful.
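A note for the technically curious on the camera-plus-lidar combination Chris describes above: at bottom it is a geometry exercise. A lidar return is a 3D point, and fusing it with an image means transforming it into the camera frame and projecting it through the camera model into pixel coordinates. Here is a minimal sketch of that projection; the intrinsics, extrinsics, and example point are invented for illustration, not anything from Google's or Aurora's actual stack.

```python
import numpy as np

# Hypothetical pinhole-camera intrinsics (focal lengths and principal
# point, in pixels). Real systems calibrate these per camera.
K = np.array([[1000.0,    0.0, 960.0],
              [   0.0, 1000.0, 600.0],
              [   0.0,    0.0,   1.0]])

def project_lidar_point(p_lidar, R, t):
    """Project one lidar point (meters) into pixel coordinates.

    R, t: rotation and translation taking the lidar frame into the
    camera frame (found by extrinsic calibration in a real system).
    Returns (u, v) in pixels, or None if the point is behind the camera.
    """
    p_cam = R @ p_lidar + t
    if p_cam[2] <= 0:          # behind the image plane, not visible
        return None
    uvw = K @ p_cam
    return uvw[0] / uvw[2], uvw[1] / uvw[2]

# Toy example: frames aligned, camera 0.1 m behind the lidar, and a
# return 50 m ahead and 2 m to the left of the sensor.
R = np.eye(3)
t = np.array([0.0, 0.0, 0.1])
print(project_lidar_point(np.array([-2.0, 0.0, 50.0]), R, t))
```

Running this puts the 50-meter return at roughly pixel (920, 600), so the lidar's range measurement can be attached to whatever the camera sees at that spot in the image.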
282 00:14:59,156 --> 00:15:03,756 Speaker 2: And as a person leading the organization, you know, 283 00:15:03,996 --> 00:15:05,956 Speaker 2: Google had been incredibly generous to me, given me this 284 00:15:06,036 --> 00:15:09,156 Speaker 2: incredible opportunity, and if I just didn't believe in it, it 285 00:15:09,236 --> 00:15:11,636 Speaker 2: wasn't my place to lead the team. And so, 286 00:15:12,356 --> 00:15:15,156 Speaker 2: you know, I'm a big believer that in business you, 287 00:15:15,156 --> 00:15:17,716 Speaker 2: you know, kind of, you have three options. If 288 00:15:17,716 --> 00:15:19,276 Speaker 2: you see something you don't like, you try and fix it. 289 00:15:19,516 --> 00:15:21,916 Speaker 2: If you can't fix it, you get in line. And 290 00:15:22,156 --> 00:15:23,436 Speaker 2: if you can't get in line, you get out of 291 00:15:23,476 --> 00:15:26,436 Speaker 2: the way. And, you know, I did what I could 292 00:15:26,476 --> 00:15:28,396 Speaker 2: to try and, you know, kind of move things the way 293 00:15:28,396 --> 00:15:30,796 Speaker 2: that I thought they needed to go, and ultimately said, 294 00:15:30,876 --> 00:15:33,756 Speaker 2: you know, this company's been incredibly generous to me. Let 295 00:15:33,756 --> 00:15:36,196 Speaker 2: me get out of the way and let it get 296 00:15:36,276 --> 00:15:37,116 Speaker 2: on its own path. 297 00:15:38,236 --> 00:15:41,996 Speaker 1: When you say you lost confidence that they were going 298 00:15:42,036 --> 00:15:45,476 Speaker 1: to make the decisions they needed to make, I mean, 299 00:15:46,196 --> 00:15:49,636 Speaker 1: obviously we should evaluate decisions based on the information available 300 00:15:49,636 --> 00:15:52,796 Speaker 1: at the time, but it is the case that, you know, 301 00:15:53,356 --> 00:15:55,236 Speaker 1: you can get in a Waymo in San Francisco in 302 00:15:55,316 --> 00:15:57,796 Speaker 1: one minute and have a beautiful ride right now. Like, 303 00:15:59,196 --> 00:15:59,836 Speaker 1: were you wrong? 304 00:16:00,476 --> 00:16:05,476 Speaker 2: Apparently, yes, right? Like, you know, like, you think, 305 00:16:05,476 --> 00:16:07,436 Speaker 2: there's no way to judge it other than they, they've 306 00:16:07,516 --> 00:16:12,596 Speaker 2: ultimately been successful. You know, maybe it would have taken a different, 307 00:16:12,596 --> 00:16:16,076 Speaker 2: shorter path, but, you know, I think, I think it's, 308 00:16:16,116 --> 00:16:18,396 Speaker 2: it's... yeah. I don't know if you're a parent, but 309 00:16:18,396 --> 00:16:19,836 Speaker 2: one of the things you hope as a parent is 310 00:16:19,876 --> 00:16:25,316 Speaker 2: that you imbue your kids with things and 311 00:16:25,316 --> 00:16:27,836 Speaker 2: then they go off in the world and succeed. And 312 00:16:27,876 --> 00:16:30,196 Speaker 2: the same is true when you lead teams or build something, 313 00:16:30,316 --> 00:16:31,956 Speaker 2: is that, you know, you do your part and you 314 00:16:31,996 --> 00:16:34,876 Speaker 2: hope it outlives you and that, you know, it outperforms what 315 00:16:34,916 --> 00:16:37,716 Speaker 2: you might have hoped for it. And I think it's 316 00:16:37,716 --> 00:16:39,556 Speaker 2: been awesome to see Waymo succeed.
317 00:16:40,236 --> 00:16:46,396 Speaker 1: So you started Aurora in twenty seventeen, right? Uh, and as 318 00:16:46,436 --> 00:16:49,116 Speaker 1: I understand it, you didn't start focused on trucks. Like, 319 00:16:49,156 --> 00:16:51,276 Speaker 1: tell me about what you're thinking when you start the company. 320 00:16:51,716 --> 00:16:54,316 Speaker 2: Yeah, how you get to trucks. So, you know, I 321 00:16:54,356 --> 00:16:57,316 Speaker 2: didn't leave Google to start the company. I left Google 322 00:16:57,316 --> 00:16:58,796 Speaker 2: because it was time, and I then spent some time 323 00:16:58,796 --> 00:17:00,436 Speaker 2: trying to figure out what to do, and kind of 324 00:17:00,436 --> 00:17:03,196 Speaker 2: came to the conclusion that, you know, it was worth 325 00:17:03,316 --> 00:17:05,196 Speaker 2: taking another shot at this and trying to build something. 326 00:17:06,436 --> 00:17:08,996 Speaker 2: And so I found two great co-founders in Sterling and Drew, 327 00:17:09,316 --> 00:17:12,156 Speaker 2: and we wanted to build a driver, and we wanted 328 00:17:12,156 --> 00:17:15,316 Speaker 2: to apply that driver in passenger cars and trucks and 329 00:17:15,316 --> 00:17:17,956 Speaker 2: wherever it could go. And we thought at the time, 330 00:17:18,396 --> 00:17:20,236 Speaker 2: we didn't see how it was going to be possible to 331 00:17:20,316 --> 00:17:23,716 Speaker 2: do trucking, because based on the experience I had and 332 00:17:23,756 --> 00:17:25,316 Speaker 2: others had, we looked at how far you had to 333 00:17:25,316 --> 00:17:27,556 Speaker 2: see down the road to do that safely, and we 334 00:17:27,596 --> 00:17:29,996 Speaker 2: came to the conclusion you just couldn't do that with 335 00:17:30,036 --> 00:17:32,436 Speaker 2: the technology that existed at the time. 336 00:17:32,436 --> 00:17:34,436 Speaker 1: And you have to see farther just because it takes longer for a 337 00:17:34,436 --> 00:17:35,356 Speaker 1: big truck to stop. 338 00:17:35,396 --> 00:17:38,836 Speaker 2: This is physics. There's just way more kinetic energy, right? 339 00:17:40,276 --> 00:17:42,396 Speaker 2: They're heavier, and they're moving at seventy miles an hour, 340 00:17:42,436 --> 00:17:44,316 Speaker 2: instead of in a city, where you're moving at fifteen miles 341 00:17:44,316 --> 00:17:46,836 Speaker 2: an hour. So the combination of the speed and the 342 00:17:46,876 --> 00:17:49,516 Speaker 2: weight means that it just takes longer to stop, a 343 00:17:49,596 --> 00:17:54,276 Speaker 2: farther distance down the road. And then you add to that, 344 00:17:54,636 --> 00:17:56,596 Speaker 2: it's much easier to drive around a light vehicle than a 345 00:17:56,596 --> 00:18:00,236 Speaker 2: big truck. You don't need a special license, they're 346 00:18:00,356 --> 00:18:04,436 Speaker 2: much smaller, you don't need special maintenance programs and whatnot. 347 00:18:04,476 --> 00:18:07,476 Speaker 2: And so we're like, okay, let's focus on light vehicles. 348 00:18:07,676 --> 00:18:10,596 Speaker 2: We see there's a real opportunity there. If we can 349 00:18:10,756 --> 00:18:16,036 Speaker 2: crack the can-you-see-far-enough problem, then trucking 350 00:18:16,076 --> 00:18:18,716 Speaker 2: would be a great application to go to. Yeah. Yeah, uh...
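The physics Chris is pointing at here is easy to put numbers on. Kinetic energy grows with the square of speed (E = mv²/2), and so does braking distance (d ≈ v²/2a, plus the ground covered during perception-and-reaction time). A minimal sketch of the sight-distance arithmetic, using illustrative deceleration and reaction-time values rather than anything truck-specific from Aurora:

```python
# Rough sight-distance arithmetic: how far ahead must the system see
# in order to stop? Illustrative assumptions: a passenger car brakes
# at ~7 m/s^2, a loaded tractor-trailer at more like ~3 m/s^2, and
# perceiving and reacting takes ~1 second.

MPH_TO_MS = 0.44704

def stopping_distance_m(speed_mph, decel_ms2, reaction_s=1.0):
    v = speed_mph * MPH_TO_MS
    # distance covered while reacting, plus kinematic braking distance
    return v * reaction_s + v**2 / (2 * decel_ms2)

print(f"car at 15 mph:   {stopping_distance_m(15, 7.0):6.1f} m")
print(f"truck at 70 mph: {stopping_distance_m(70, 3.0):6.1f} m")
```

Under these toy assumptions the loaded truck needs on the order of two hundred meters of clear sight line, versus roughly ten meters for the car in city traffic, which is why sensing range was the gating problem for trucking.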
351 00:18:18,876 --> 00:18:20,516 Speaker 2: And so part of what I spent my time doing 352 00:18:20,596 --> 00:18:22,876 Speaker 2: during the first couple of years of the company was, 353 00:18:22,916 --> 00:18:25,316 Speaker 2: you know, basically turning over every rock we could, 354 00:18:25,316 --> 00:18:28,236 Speaker 2: with Bart, one of our early team members, to find 355 00:18:28,236 --> 00:18:30,596 Speaker 2: a technology that would allow us to see far 356 00:18:30,676 --> 00:18:35,196 Speaker 2: enough that we could actually do trucking. And ultimately we 357 00:18:35,236 --> 00:18:38,876 Speaker 2: found that in this little company in Montana called Blackmore 358 00:18:39,796 --> 00:18:42,356 Speaker 2: that had built this really special kind of laser range 359 00:18:42,356 --> 00:18:45,796 Speaker 2: finder that, because of the way it did measurements, 360 00:18:46,036 --> 00:18:53,356 Speaker 2: what's called frequency-modulated continuous wave, it could see 361 00:18:53,356 --> 00:18:55,596 Speaker 2: basically twice as far as the rest of the lidar 362 00:18:55,636 --> 00:18:59,116 Speaker 2: stuff out there could see. And so we're like, huh, 363 00:18:59,236 --> 00:19:02,916 Speaker 2: we now have some really interesting driving capability, 364 00:19:02,916 --> 00:19:06,796 Speaker 2: we've now got this pseudo-magical lidar sensor that can 365 00:19:06,836 --> 00:19:10,396 Speaker 2: see far enough. Trucking feels like a really great application 366 00:19:10,476 --> 00:19:11,116 Speaker 2: to go take on. 367 00:19:12,156 --> 00:19:13,436 Speaker 1: Let's talk about a few of the things you had 368 00:19:13,436 --> 00:19:15,516 Speaker 1: to figure out. So you decide you're going to do trucking. Yeah. 369 00:19:15,556 --> 00:19:19,116 Speaker 1: And presumably by trucking... Like, I mean, I know where 370 00:19:19,156 --> 00:19:22,556 Speaker 1: you are now. So presumably by trucking you don't 371 00:19:22,596 --> 00:19:26,276 Speaker 1: mean, like, the bread truck that goes to the little 372 00:19:26,396 --> 00:19:28,516 Speaker 1: market by my house. You mean the great, big truck 373 00:19:28,556 --> 00:19:29,676 Speaker 1: that goes on the freeway. 374 00:19:30,076 --> 00:19:33,316 Speaker 2: Yeah. That's where we thought there was the biggest opportunity: 375 00:19:33,636 --> 00:19:36,876 Speaker 2: working what we call class eight tractor-trailers, big, 376 00:19:36,996 --> 00:19:40,196 Speaker 2: you know, semi trucks. You know, they drive a lot 377 00:19:40,196 --> 00:19:43,156 Speaker 2: of miles. We don't have enough people who want to 378 00:19:43,196 --> 00:19:47,796 Speaker 2: do that job, and yet it's absolutely essential to the American 379 00:19:47,956 --> 00:19:52,356 Speaker 2: and worldwide way of life, and there's a real need. 380 00:19:53,156 --> 00:19:55,436 Speaker 2: Right? As we've dug into it, we realized that there's 381 00:19:55,756 --> 00:19:58,316 Speaker 2: five thousand people killed every year in heavy truck accidents, 382 00:19:58,556 --> 00:20:01,316 Speaker 2: five hundred thousand people injured.
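For readers curious how a Blackmore-style sensor buys that range: in FMCW (frequency-modulated continuous wave) lidar the laser frequency is swept linearly, and mixing the echo with the outgoing sweep produces a beat frequency proportional to the round-trip delay, hence to range; target motion adds a Doppler term, so each point also carries a velocity measurement. A toy sketch of those relationships, with made-up chirp parameters that are not Blackmore's actual design:

```python
C = 3.0e8              # speed of light, m/s
WAVELENGTH = 1.55e-6   # typical telecom-band laser wavelength, m

# Hypothetical chirp: sweep 1 GHz of optical bandwidth in 10 microseconds.
CHIRP_SLOPE = 1.0e9 / 10e-6    # Hz per second

def beat_frequency(range_m, radial_velocity_ms=0.0):
    """Beat frequency after mixing the echo with the outgoing chirp."""
    f_range = 2 * range_m * CHIRP_SLOPE / C       # round-trip delay term
    f_doppler = 2 * radial_velocity_ms / WAVELENGTH  # motion term
    return f_range + f_doppler

def range_from_beat(f_beat_hz):
    """Invert the range term (static target) back to meters."""
    return f_beat_hz * C / (2 * CHIRP_SLOPE)

# A static target 300 m out produces a ~200 MHz beat under these numbers:
fb = beat_frequency(300.0)
print(f"beat: {fb/1e6:.0f} MHz -> range: {range_from_beat(fb):.0f} m")
```

The coherent detection behind this scheme is also, in general, part of what lets FMCW sensors pull weak, distant returns out of the noise better than the pulsed time-of-flight lidars of that era, which is the "see twice as far" claim in practical terms.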
And then we can move 383 00:20:01,356 --> 00:20:05,436 Speaker 2: goods more efficiently, which, you know, is important, particularly in 384 00:20:05,676 --> 00:20:07,956 Speaker 2: the, you know, this age of e-commerce, where people 385 00:20:07,956 --> 00:20:11,676 Speaker 2: expect things next day. You know, being able to 386 00:20:11,996 --> 00:20:15,356 Speaker 2: move those goods efficiently to customers is really valuable. 387 00:20:17,556 --> 00:20:21,556 Speaker 1: Okay. This April, if I've got my dates right, you 388 00:20:21,676 --> 00:20:26,196 Speaker 1: actually have a truck drive, hauling real stuff, on the 389 00:20:26,236 --> 00:20:28,676 Speaker 1: freeway, without someone behind the wheel, right? 390 00:20:28,876 --> 00:20:31,876 Speaker 2: That's exactly right. It was awesome, right? We've as a 391 00:20:31,916 --> 00:20:34,756 Speaker 2: company been building towards that for almost eight and a 392 00:20:34,756 --> 00:20:38,196 Speaker 2: half years, and, you know, we 393 00:20:38,356 --> 00:20:42,396 Speaker 2: take doing these things safely very seriously, and really a 394 00:20:42,436 --> 00:20:44,596 Speaker 2: lot of the last couple of years has been making 395 00:20:44,596 --> 00:20:48,036 Speaker 2: sure we could have confidence that when we let the 396 00:20:48,036 --> 00:20:50,996 Speaker 2: thing go, that it was actually going to be safe. 397 00:20:51,036 --> 00:20:52,476 Speaker 2: And then we got to a point where we're like, yeah, 398 00:20:52,556 --> 00:20:54,556 Speaker 2: this is safe. We can go send this out in 399 00:20:54,596 --> 00:20:57,956 Speaker 2: the daylight and dry weather and have it drive back 400 00:20:57,956 --> 00:21:00,556 Speaker 2: and forth between Dallas and Houston. And I had the 401 00:21:00,596 --> 00:21:05,876 Speaker 2: privilege of riding along in the back seat. And it 402 00:21:05,996 --> 00:21:10,116 Speaker 2: was awesome, right, in that it worked. Incredibly boring, right? 403 00:21:10,156 --> 00:21:11,196 Speaker 2: It turns out it's like... 404 00:21:11,356 --> 00:21:14,716 Speaker 1: You want it to be so boring. You want it to be very boring. 405 00:21:14,716 --> 00:21:18,396 Speaker 2: But it just kind of worked. And I went 406 00:21:18,476 --> 00:21:21,476 Speaker 2: down and back that day in the truck. 407 00:21:21,276 --> 00:21:23,516 Speaker 1: Were you hauling stuff at that point? What was it? 408 00:21:23,596 --> 00:21:25,396 Speaker 2: We were. We were hauling pastries. 409 00:21:25,796 --> 00:21:31,796 Speaker 1: Oh, pastries. Okay, pastries both ways? That would be 410 00:21:31,836 --> 00:21:34,236 Speaker 1: a bit much. There's some pastries... people just want 411 00:21:34,276 --> 00:21:34,356 Speaker 1: to know. 412 00:21:34,996 --> 00:21:37,316 Speaker 2: I think there is a pastry imbalance. I think, 413 00:21:37,596 --> 00:21:39,516 Speaker 2: for the first trip, I think we may have actually 414 00:21:39,716 --> 00:21:41,996 Speaker 2: been empty. But on the way back, I'm pretty sure, 415 00:21:41,996 --> 00:21:47,316 Speaker 2: for whatever reasons, the pastries diffused from Houston to Dallas. 416 00:21:47,516 --> 00:21:50,636 Speaker 1: So then what happens in May? In May you put 417 00:21:50,636 --> 00:21:53,636 Speaker 1: a person back behind the wheel, right? Tell me about that. 418 00:21:54,076 --> 00:21:56,836 Speaker 2: Yeah.
We, so as a company, we've really focused on 419 00:21:57,196 --> 00:21:59,116 Speaker 2: doing what we do best, and what we do best 420 00:21:59,196 --> 00:22:02,556 Speaker 2: is building a safe, capable driving system, the Aurora Driver. 421 00:22:03,116 --> 00:22:06,396 Speaker 2: And we worked with other companies, like Peterbilt at PACCAR 422 00:22:06,596 --> 00:22:11,956 Speaker 2: and Volvo Trucks, and our partners at Peterbilt called 423 00:22:12,036 --> 00:22:14,956 Speaker 2: up and said, hey, we'd like you to have an 424 00:22:14,996 --> 00:22:17,796 Speaker 2: observer in the seat, because there's some prototype parts in 425 00:22:18,476 --> 00:22:21,516 Speaker 2: our truck and it matters to us that you put 426 00:22:21,516 --> 00:22:25,356 Speaker 2: an observer in. And so we said, okay. Uh, you know, 427 00:22:25,396 --> 00:22:28,236 Speaker 2: we had a conversation. We respect them, and we ultimately 428 00:22:28,236 --> 00:22:30,316 Speaker 2: put the observer in the seat. And so we have a 429 00:22:30,356 --> 00:22:33,556 Speaker 2: person on board who sits in the driver's seat, 430 00:22:33,916 --> 00:22:36,476 Speaker 2: but they're an observer, and they're really just there because 431 00:22:36,516 --> 00:22:37,916 Speaker 2: our partner asked us to put them there. 432 00:22:38,356 --> 00:22:41,476 Speaker 1: Though presumably they are licensed to drive an 433 00:22:41,476 --> 00:22:43,836 Speaker 1: eighteen-wheeler. It's not just some guy watching 434 00:22:43,876 --> 00:22:45,756 Speaker 1: what's going on. That's a truck driver. 435 00:22:45,636 --> 00:22:47,596 Speaker 2: You know, we don't want to put anyone 436 00:22:47,596 --> 00:22:49,396 Speaker 2: in a place where, you know, there has to 437 00:22:49,556 --> 00:22:51,596 Speaker 2: be a debate about it. It's just like, okay, let's just 438 00:22:51,636 --> 00:22:54,036 Speaker 2: put a truck driver there. They sit there, and if you 439 00:22:54,076 --> 00:22:58,876 Speaker 2: go to YouTube dot com slash Aurora Driver, you 440 00:22:58,876 --> 00:23:01,036 Speaker 2: can see a live video, between eight and five o'clock 441 00:23:01,116 --> 00:23:04,316 Speaker 2: every day, of our trucks driving down the road, and 442 00:23:04,356 --> 00:23:07,196 Speaker 2: you can see that, you know, this person sits there, 443 00:23:07,276 --> 00:23:10,236 Speaker 2: and sometimes they're twiddling their thumbs, sometimes they're eating Fritos, 444 00:23:11,436 --> 00:23:14,116 Speaker 2: you know, just observing. 445 00:23:14,516 --> 00:23:19,076 Speaker 1: I mean, at least from a narrative standpoint, it's a 446 00:23:19,076 --> 00:23:21,316 Speaker 1: bummer to have to put the person back behind the wheel, 447 00:23:21,356 --> 00:23:24,836 Speaker 2: surely. Absolutely, from a narrative standpoint it's a bummer, right? 448 00:23:24,916 --> 00:23:28,636 Speaker 2: And it creates a nuance that is, you know, certainly 449 00:23:28,636 --> 00:23:29,836 Speaker 2: in today's... 450 00:23:29,436 --> 00:23:30,996 Speaker 1: You don't want a person by the wheel. It's the 451 00:23:31,036 --> 00:23:33,116 Speaker 1: whole point. It's the whole point. You don't want 452 00:23:33,116 --> 00:23:34,356 Speaker 1: a person by the wheel, right? 453 00:23:34,476 --> 00:23:38,076 Speaker 2: Totally agree. But in terms of the development and delivery 454 00:23:38,076 --> 00:23:40,556 Speaker 2: of what we're building, it just doesn't matter, right? It's 455 00:23:40,556 --> 00:23:41,876 Speaker 2: a complete nothing.
456 00:23:41,956 --> 00:23:44,236 Speaker 1: Well, I mean, at some margin it matters. At some point 457 00:23:44,316 --> 00:23:46,276 Speaker 1: it matters, right? The whole premise is you can do 458 00:23:46,316 --> 00:23:49,476 Speaker 1: it without a person there, and so, like, at some point, 459 00:23:49,756 --> 00:23:52,996 Speaker 1: like, relatively soon, I would imagine, you need to do 460 00:23:53,036 --> 00:23:53,996 Speaker 1: it without a person there. 461 00:23:54,356 --> 00:23:58,316 Speaker 2: That's right, and, uh, we look forward to doing that soon. 462 00:24:00,956 --> 00:24:14,516 Speaker 3: We'll be back in just a minute. 463 00:24:14,836 --> 00:24:16,996 Speaker 1: Yeah. So where are you now? Like, bring me 464 00:24:16,996 --> 00:24:20,916 Speaker 1: basically to the present now, after our twenty-year journey. 465 00:24:21,156 --> 00:24:24,796 Speaker 1: Like, what's... like, are people paying you to 466 00:24:24,956 --> 00:24:25,556 Speaker 1: haul stuff? 467 00:24:25,916 --> 00:24:29,996 Speaker 2: Yeah, absolutely. So, like, today is an incredibly exciting moment. 468 00:24:30,596 --> 00:24:32,796 Speaker 2: So we have a technology that can drive a truck. 469 00:24:32,876 --> 00:24:36,036 Speaker 2: It has the skills necessary to drive on the freeway 470 00:24:36,116 --> 00:24:40,436 Speaker 2: and drive to sites off the freeway in these industrial 471 00:24:40,476 --> 00:24:44,076 Speaker 2: park areas, and it does that safely. And then internally 472 00:24:44,116 --> 00:24:47,156 Speaker 2: we've got the capability to say, okay, what are the new 473 00:24:47,196 --> 00:24:50,876 Speaker 2: features we need to make this more useful for our customers? 474 00:24:51,396 --> 00:24:55,516 Speaker 2: And so we launched in April. About three months later 475 00:24:55,956 --> 00:24:57,916 Speaker 2: we were able to, you know, 476 00:24:57,996 --> 00:25:00,756 Speaker 2: launch the next version, which now doesn't just operate 477 00:25:00,836 --> 00:25:03,956 Speaker 2: in the daytime but operates at night as well. And 478 00:25:04,036 --> 00:25:06,156 Speaker 2: over the course of the rest of this year, it's really 479 00:25:06,196 --> 00:25:09,596 Speaker 2: about taking and enhancing the 480 00:25:09,596 --> 00:25:11,716 Speaker 2: skills that it has, adding the few new skills 481 00:25:11,756 --> 00:25:14,236 Speaker 2: it needs to be able to drive between Fort Worth 482 00:25:14,436 --> 00:25:17,796 Speaker 2: and Phoenix and El Paso. And so at that point... 483 00:25:17,756 --> 00:25:20,276 Speaker 1: Just to be clear, the route you have now is Houston to Dallas, 484 00:25:20,356 --> 00:25:20,716 Speaker 1: is that right? 485 00:25:20,716 --> 00:25:23,156 Speaker 2: That's right. I'm sorry, we drive between Dallas and Houston, and 486 00:25:23,196 --> 00:25:24,596 Speaker 2: by the end of the year we expect to be 487 00:25:24,676 --> 00:25:28,956 Speaker 2: driving between Fort Worth and Phoenix. And what's exciting about... 488 00:25:29,036 --> 00:25:31,876 Speaker 2: and Fort Worth and El Paso, and El Paso and Phoenix.
489 00:25:33,156 --> 00:25:37,036 Speaker 2: And what's exciting about that is Fort Worth to Phoenix 490 00:25:37,116 --> 00:25:40,716 Speaker 2: is a thousand miles, and a person is not able 491 00:25:40,756 --> 00:25:45,236 Speaker 2: to legally drive that in a day. And so we'll 492 00:25:45,236 --> 00:25:48,956 Speaker 2: be able to start doing things that are superhuman in 493 00:25:49,156 --> 00:25:50,236 Speaker 2: more and more dimensions. 494 00:25:50,636 --> 00:25:54,276 Speaker 1: Presumably the value proposition is really long drives, right? Like, 495 00:25:55,316 --> 00:25:58,276 Speaker 1: it makes absolutely no sense for, whatever, one hundred miles 496 00:25:58,356 --> 00:26:00,636 Speaker 1: or something. And the longer it gets, the more the 497 00:26:00,676 --> 00:26:01,636 Speaker 1: economics makes sense. Is that right? 498 00:26:01,716 --> 00:26:04,276 Speaker 2: I mean, I think it makes sense 499 00:26:04,316 --> 00:26:07,236 Speaker 2: across all different kinds of scales of miles. But I think 500 00:26:07,276 --> 00:26:09,876 Speaker 2: where the biggest impact will ultimately be is on these long- 501 00:26:09,876 --> 00:26:10,476 Speaker 2: haul trips. 502 00:26:11,156 --> 00:26:17,796 Speaker 1: And, like, I get the freeway part, broadly. How 503 00:26:17,796 --> 00:26:20,676 Speaker 1: does the not-on-the-freeway part work? How constrained 504 00:26:20,716 --> 00:26:22,596 Speaker 1: are you as to where you can go when you're 505 00:26:22,636 --> 00:26:23,516 Speaker 1: not on the freeway? 506 00:26:23,796 --> 00:26:26,436 Speaker 2: Yeah. So the capability we're building is to be able 507 00:26:26,476 --> 00:26:32,076 Speaker 2: to drive everywhere. Sure. But, like, now, so today, we 508 00:26:32,716 --> 00:26:36,596 Speaker 2: drive from a place that's off the freeway onto the freeway, 509 00:26:37,036 --> 00:26:39,196 Speaker 2: and then when we get in at the Dallas end 510 00:26:39,196 --> 00:26:41,276 Speaker 2: of the route, and then at the Houston end, we 511 00:26:41,356 --> 00:26:44,156 Speaker 2: drive about five miles from where we get off the 512 00:26:44,196 --> 00:26:48,076 Speaker 2: freeway to our terminal site, and along the way we 513 00:26:48,236 --> 00:26:51,756 Speaker 2: drive past customer locations. So we're making the choice today 514 00:26:51,756 --> 00:26:57,956 Speaker 2: to go to our terminals, but it's relatively straightforward to 515 00:26:57,996 --> 00:26:59,836 Speaker 2: stop and go to a different terminal instead, or a 516 00:26:59,876 --> 00:27:00,636 Speaker 2: different site. 517 00:27:01,596 --> 00:27:05,836 Speaker 1: Like, how hard is that off-the-freeway part, the 518 00:27:05,916 --> 00:27:10,076 Speaker 1: sort of marginal, you know, mile when not on the freeway? 519 00:27:11,196 --> 00:27:14,836 Speaker 2: Not very, would be the short answer. Right? You know, 520 00:27:16,356 --> 00:27:18,676 Speaker 2: for a lot of time 521 00:27:18,756 --> 00:27:21,636 Speaker 2: the primary focus of what we're building was the freeway part, because, 522 00:27:22,236 --> 00:27:26,156 Speaker 2: as you pointed out, that's where the most kinetic energy is. 523 00:27:26,196 --> 00:27:30,756 Speaker 2: That's where, you know, that's where the bad things could happen. Yeah.
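The "a person is not able to legally drive that in a day" point a moment ago checks out with back-of-the-envelope arithmetic: US FMCSA hours-of-service rules cap a property-carrying driver at eleven hours of driving per duty day, and eleven hours at freeway speed falls a few hundred miles short of the Fort Worth-to-Phoenix run. A quick sketch, where only the average speed is an assumed figure:

```python
# FMCSA hours-of-service: max 11 hours of driving per duty day for a
# property-carrying driver. At typical freeway speeds that caps the
# distance a single human can legally cover in one day.

HOS_DRIVING_LIMIT_H = 11
AVG_SPEED_MPH = 62          # illustrative average, stops included

human_max_miles = HOS_DRIVING_LIMIT_H * AVG_SPEED_MPH
route_miles = 1000          # roughly Fort Worth to Phoenix

print(f"human daily limit: ~{human_max_miles} miles")
print("route needs more than one driver-day"
      if route_miles > human_max_miles else "route fits in one day")
print(f"driverless at {AVG_SPEED_MPH} mph: "
      f"{route_miles / AVG_SPEED_MPH:.1f} h nonstop")
```

At these numbers a solo human tops out around 680 miles a day, while a driverless truck could run the thousand miles in roughly sixteen hours without a reset break, which is the "superhuman" dimension being described.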
524 00:27:30,836 --> 00:27:33,956 Speaker 1: I mean, it's also simpler in certain ways, right? Like, 525 00:27:34,196 --> 00:27:36,996 Speaker 1: it seems like turning a big truck around, or just 526 00:27:37,076 --> 00:27:38,796 Speaker 1: getting a truck around... Like, I live in New York 527 00:27:38,836 --> 00:27:41,196 Speaker 1: City and I see eighteen wheelers in New York City, 528 00:27:41,676 --> 00:27:44,836 Speaker 1: and, I don't know, maybe that's, maybe that's not that 529 00:27:44,916 --> 00:27:47,716 Speaker 1: hard to engineer for autonomy. You tell me. It seems hard. 530 00:27:47,716 --> 00:27:48,476 Speaker 1: It looks hard. 531 00:27:48,676 --> 00:27:51,716 Speaker 2: Yeah, I think New York City is hard, right? And 532 00:27:51,756 --> 00:27:53,436 Speaker 2: I think there'll be a time where we get to 533 00:27:53,476 --> 00:27:56,676 Speaker 2: that. Where I'm really focused, and we're really focused, 534 00:27:56,756 --> 00:27:58,756 Speaker 2: in the near term is how do we get to 535 00:27:58,756 --> 00:28:02,116 Speaker 2: the places where most freight moves, right? And a lot 536 00:28:02,116 --> 00:28:06,836 Speaker 2: of that will go to these industrial parks in regions 537 00:28:06,836 --> 00:28:10,516 Speaker 2: of a city. And the reason why is, it's hard to 538 00:28:10,596 --> 00:28:13,796 Speaker 2: drive a truck in New York City, right? And it's slow. 539 00:28:14,236 --> 00:28:16,476 Speaker 2: And so if you're trying to build a distribution center 540 00:28:16,516 --> 00:28:18,796 Speaker 2: for these class eight tractors and trailers, you put them 541 00:28:18,796 --> 00:28:21,676 Speaker 2: in places where it's convenient, because you're trying to run 542 00:28:21,716 --> 00:28:24,596 Speaker 2: a business, not build a technology showcase. 543 00:28:25,316 --> 00:28:29,996 Speaker 1: So what can you not do now? 544 00:28:30,996 --> 00:28:34,756 Speaker 2: So today we can't drive in the rain. Well, in fact, 545 00:28:34,836 --> 00:28:36,876 Speaker 2: that's not actually true. So today we drive in the 546 00:28:36,956 --> 00:28:42,116 Speaker 2: rain really well, but we haven't validated that fully for 547 00:28:42,236 --> 00:28:45,196 Speaker 2: driverless operations. And this is really an important idea, right? 548 00:28:45,276 --> 00:28:47,676 Speaker 2: If you came down to Texas and got a 549 00:28:47,756 --> 00:28:49,516 Speaker 2: ride in one of our trucks and it was raining out, 550 00:28:50,516 --> 00:28:54,436 Speaker 2: it would work, and you would go, huh, why have 551 00:28:54,436 --> 00:28:56,756 Speaker 2: you guys not launched this? And we would say, yeah, 552 00:28:56,916 --> 00:29:00,636 Speaker 2: it basically works. But that's not good enough. It's not 553 00:29:00,676 --> 00:29:02,596 Speaker 2: good enough that it feels like it works. What we 554 00:29:02,676 --> 00:29:03,516 Speaker 2: want to have done is... 555 00:29:03,476 --> 00:29:07,036 Speaker 1: Yeah, "basically works" makes me nervous with a ten-ton truck. 556 00:29:07,076 --> 00:29:09,996 Speaker 2: Exactly, and it should, right? And so we put a 557 00:29:10,036 --> 00:29:13,356 Speaker 2: lot of effort into making sure that it doesn't just 558 00:29:13,436 --> 00:29:15,956 Speaker 2: seem like it works, but we have conviction that it 559 00:29:15,996 --> 00:29:21,036 Speaker 2: will work, right?
And, you know, as an example, two 560 00:29:21,076 --> 00:29:24,956 Speaker 2: of the last things that we got checked off 561 00:29:24,996 --> 00:29:30,916 Speaker 2: when we launched were, one, really high speed motorbikes 562 00:29:31,396 --> 00:29:34,996 Speaker 2: running red lights in the city. So we had done 563 00:29:35,036 --> 00:29:37,956 Speaker 2: a bunch of testing, I think up to, let me say, 564 00:29:38,116 --> 00:29:39,796 Speaker 2: I'm gonna say some numbers here, these are probably not 565 00:29:39,836 --> 00:29:42,996 Speaker 2: quite right, but I'll use some numbers, you know, at 566 00:29:43,076 --> 00:29:45,596 Speaker 2: ninety miles an hour, because we figured, you know, ninety 567 00:29:45,636 --> 00:29:48,636 Speaker 2: miles an hour on a surface road, nobody... 568 00:29:48,396 --> 00:29:50,516 Speaker 1: needs the motorcycles going ninety. 569 00:29:50,636 --> 00:29:53,116 Speaker 2: Motorcycles going ninety miles an hour, and they're going to 570 00:29:53,196 --> 00:29:55,396 Speaker 2: run a red light in front of us, and we 571 00:29:55,436 --> 00:29:57,476 Speaker 2: want the truck to not hit that motorbike. And then 572 00:29:57,516 --> 00:29:59,516 Speaker 2: we're on the freeway and we see a motorbike doing 573 00:29:59,556 --> 00:30:03,076 Speaker 2: one hundred and fifty miles an hour, and we're like, okay, 574 00:30:03,116 --> 00:30:07,596 Speaker 2: that is... let's go make sure that that works, right, 575 00:30:07,636 --> 00:30:10,956 Speaker 2: because, you know, at some point we just want to 576 00:30:10,996 --> 00:30:12,716 Speaker 2: have confidence. And so we went through and we found 577 00:30:12,756 --> 00:30:17,316 Speaker 2: out that, when a motorbike's going that extremely fast... first, we 578 00:30:18,036 --> 00:30:20,636 Speaker 2: found that our simulation wasn't quite up to snuff to 579 00:30:20,676 --> 00:30:22,196 Speaker 2: deal with that, so we had to go fix that. 580 00:30:23,316 --> 00:30:26,916 Speaker 2: And then we found that our perception system wasn't able 581 00:30:26,996 --> 00:30:28,876 Speaker 2: to quite track it as well as we'd want, so 582 00:30:28,916 --> 00:30:31,356 Speaker 2: we had some things to fix there. And so ultimately 583 00:30:31,436 --> 00:30:33,276 Speaker 2: we got to the point where, if you're a motorbike, 584 00:30:33,716 --> 00:30:36,716 Speaker 2: please don't go test this, running a red light 585 00:30:36,716 --> 00:30:38,836 Speaker 2: at one hundred and fifty miles an hour. We're going 586 00:30:38,916 --> 00:30:41,236 Speaker 2: to see you, and assuming, you know, we can actually 587 00:30:41,236 --> 00:30:42,836 Speaker 2: see you, because there's not a building in the way, 588 00:30:43,356 --> 00:30:46,316 Speaker 2: we're going to stop and, you know, not have an 589 00:30:46,316 --> 00:30:46,876 Speaker 2: event with you. 590 00:30:47,396 --> 00:30:50,836 Speaker 1: So is the rain analog of that, like, more rain 591 00:30:50,916 --> 00:30:54,316 Speaker 1: than I can imagine, and flooding, and it's dark or something? 592 00:30:54,396 --> 00:30:57,876 Speaker 1: I mean, yeah, maybe frozen, frozen also. 593 00:30:58,156 --> 00:31:01,676 Speaker 2: And so this is, this is the answer.
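The 150-mile-an-hour crosser is a stress test for a reason: the faster the bike, the farther out perception has to detect and track it for the truck to either clear the conflict point or stop short of it. A toy feasibility check of that geometry, with illustrative numbers only, not Aurora's actual validation criteria:

```python
# Toy check: a motorbike crossing our path at very high speed. Can the
# truck stop short of the conflict point, and how far away is the bike
# at the moment a decision has to be made?

MPH_TO_MS = 0.44704

def can_stop_short(truck_mph, dist_to_conflict_m,
                   decel_ms2=3.0, reaction_s=1.0):
    """True if braking now halts the truck before the conflict point."""
    v = truck_mph * MPH_TO_MS
    stopping = v * reaction_s + v**2 / (2 * decel_ms2)
    return stopping < dist_to_conflict_m

def required_detection_range_m(bike_mph, time_needed_s):
    """How far out the bike is when we must already be tracking it."""
    return bike_mph * MPH_TO_MS * time_needed_s

# Truck at 15 mph approaching an intersection 40 m ahead:
print(can_stop_short(15, 40.0))
# If classifying, predicting, and braking takes ~4 s end to end, a
# 150 mph bike is already this many meters away at decision time:
print(f"{required_detection_range_m(150, 4.0):.0f} m")
```

Even in this crude form the numbers show why the team found gaps: at those speeds the bike covers well over 250 meters in a few seconds, so both the simulator and the tracker have to stay accurate at unusually long ranges and closing rates.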
We 594 00:31:01,756 --> 00:31:03,996 Speaker 2: have some good ideas of what these challenges are, and 595 00:31:04,076 --> 00:31:06,036 Speaker 2: so we're in the process of putting in place the last 596 00:31:06,036 --> 00:31:08,916 Speaker 2: of the tests that we need to say, yeah, across 597 00:31:08,956 --> 00:31:12,556 Speaker 2: the things that might be a problem, we actually handle 598 00:31:12,596 --> 00:31:13,756 Speaker 2: those in a way that's safe. 599 00:31:14,156 --> 00:31:16,196 Speaker 1: Once you've got rain, are you good? I mean, I 600 00:31:16,196 --> 00:31:20,756 Speaker 1: guess you have the whole southern US without snow and ice, right? Well, 601 00:31:20,876 --> 00:31:23,476 Speaker 1: I think it never freezes in Texas, right? Does it freeze? 602 00:31:23,796 --> 00:31:26,596 Speaker 2: We do get snow in Texas. But 603 00:31:27,036 --> 00:31:29,916 Speaker 2: the good news is, like, the reason why a lot 604 00:31:29,956 --> 00:31:33,676 Speaker 2: of freight moves along the Sun Belt is because trucks 605 00:31:33,796 --> 00:31:36,916 Speaker 2: don't like driving in snow, right? It's just, you know, 606 00:31:36,996 --> 00:31:39,036 Speaker 2: much like people in general don't like driving in snow. 607 00:31:39,236 --> 00:31:40,796 Speaker 2: But there is snow. But if you've ever been in 608 00:31:40,836 --> 00:31:43,036 Speaker 2: Texas when there's a snowstorm, it basically shuts down. 609 00:31:43,396 --> 00:31:46,076 Speaker 1: Yeah. So you don't, you don't have to solve snow. 610 00:31:46,156 --> 00:31:48,036 Speaker 1: You don't have to solve snow for now. 611 00:31:48,076 --> 00:31:50,956 Speaker 2: For now. But we will solve snow, right? Because all 612 00:31:50,996 --> 00:31:52,916 Speaker 2: of the stuff that works for rain mostly works for snow. 613 00:31:52,956 --> 00:31:55,436 Speaker 1: To me, all of these problems, we can stipulate, will 614 00:31:55,476 --> 00:31:58,716 Speaker 1: be solved by somebody, right? Yeah, plausibly by you, if 615 00:31:58,716 --> 00:32:01,156 Speaker 1: not you, by somebody else. But, like, I feel like 616 00:32:01,196 --> 00:32:04,676 Speaker 1: there's some sense of urgency. Like, I'll say, so interestingly, 617 00:32:04,876 --> 00:32:09,996 Speaker 1: just purely coincidentally, I talked to, uh, Boris Sofman. 618 00:32:10,116 --> 00:32:10,916 Speaker 2: Is it Sofman? Yeah. 619 00:32:10,916 --> 00:32:13,676 Speaker 1: That's his name. And so he said, and 620 00:32:13,716 --> 00:32:16,076 Speaker 1: we were not talking about you, this was not a 621 00:32:16,116 --> 00:32:18,036 Speaker 1: dig on you, it was more talking about why he's 622 00:32:18,116 --> 00:32:21,636 Speaker 1: doing what he's doing. He said, I don't think open 623 00:32:21,676 --> 00:32:25,596 Speaker 1: road autonomy is a startup game. Uh, and he basically, 624 00:32:25,836 --> 00:32:28,156 Speaker 1: you probably know what he thinks, I mean, he basically thinks 625 00:32:28,156 --> 00:32:29,236 Speaker 1: it's too expensive. 626 00:32:29,636 --> 00:32:29,876 Speaker 2: Yeah. 627 00:32:29,956 --> 00:32:32,796 Speaker 1: In short, too expensive and too hard, in that way, 628 00:32:33,436 --> 00:32:35,756 Speaker 1: for a company that is capital constrained, for a company 629 00:32:35,756 --> 00:32:37,796 Speaker 1: that, you know, doesn't have a monopoly on search. 630 00:32:37,956 --> 00:32:38,476 Speaker 2: Let's see.
631 00:32:39,156 --> 00:32:40,796 Speaker 1: And so I do feel like, in talking to you, 632 00:32:40,956 --> 00:32:43,996 Speaker 1: you plausibly may solve them. But there must 633 00:32:44,036 --> 00:32:46,476 Speaker 1: be some clock for you, right? You must have to 634 00:32:46,516 --> 00:32:50,436 Speaker 1: get to a point where you're making money relatively soon. 635 00:32:50,556 --> 00:32:52,396 Speaker 2: Is that right? Yeah. Well, we've been at the game 636 00:32:52,436 --> 00:32:54,996 Speaker 2: for a while. You know, we went into it eyes 637 00:32:55,036 --> 00:32:58,196 Speaker 2: open, right, and looked at it. You know, having spent 638 00:32:58,236 --> 00:33:00,276 Speaker 2: the time with Waymo, having the history I have, I 639 00:33:00,396 --> 00:33:02,676 Speaker 2: was like, this is not a one hundred million dollar problem 640 00:33:02,796 --> 00:33:04,476 Speaker 2: or a ten million dollar problem. This is a multi 641 00:33:04,476 --> 00:33:08,276 Speaker 2: billion dollar problem. And so as we've built Aurora, it's 642 00:33:08,316 --> 00:33:11,676 Speaker 2: been, okay, we need to set ourselves up with the staff, 643 00:33:11,676 --> 00:33:13,996 Speaker 2: we need to set ourselves up with the capital partners, 644 00:33:13,996 --> 00:33:17,916 Speaker 2: we need to set ourselves up with the technology partners 645 00:33:18,596 --> 00:33:20,676 Speaker 2: that allow us to take that run. And I think 646 00:33:20,716 --> 00:33:23,156 Speaker 2: we've been very careful about how we've done that, and 647 00:33:23,156 --> 00:33:25,116 Speaker 2: we've put ourselves in a position to succeed. And I 648 00:33:25,116 --> 00:33:28,356 Speaker 2: agree with Boris. This is, you know, I hear 649 00:33:28,396 --> 00:33:30,476 Speaker 2: people throwing out, oh, we're gonna, you know, spend one 650 00:33:30,516 --> 00:33:32,676 Speaker 2: hundred million dollars to do this, and I think, you know, 651 00:33:32,916 --> 00:33:38,756 Speaker 2: you haven't even met the ante, right? And so for us, 652 00:33:39,796 --> 00:33:41,796 Speaker 2: we look at, say, driving in the rain; we expect 653 00:33:41,796 --> 00:33:44,956 Speaker 2: to be solving that by the end of this year, right? 654 00:33:44,996 --> 00:33:47,916 Speaker 2: And at that point, as you point out, we've basically 655 00:33:47,996 --> 00:33:51,556 Speaker 2: unlocked the southern freight belt. The amount of truck travel 656 00:33:51,596 --> 00:33:56,796 Speaker 2: that's there is gigantic. And so, you know, from now, 657 00:33:56,796 --> 00:33:58,956 Speaker 2: where we have a small number of trucks on the road, 658 00:33:59,276 --> 00:34:01,276 Speaker 2: by the end of next year, end of twenty six, 659 00:34:01,356 --> 00:34:04,716 Speaker 2: we expect to have a few hundred trucks on the road, 660 00:34:05,076 --> 00:34:06,796 Speaker 2: and then the year after that we expect to have 661 00:34:06,836 --> 00:34:09,076 Speaker 2: one thousand plus trucks on the road, and then we'll 662 00:34:09,156 --> 00:34:14,116 Speaker 2: build from there. And so, you know, the conviction 663 00:34:14,156 --> 00:34:16,276 Speaker 2: we're starting to build internally is like, yeah, this is 664 00:34:16,276 --> 00:34:17,956 Speaker 2: turning over the way we want it, 665 00:34:17,956 --> 00:34:20,396 Speaker 2: and what we expect is happening now. And so it's 666 00:34:20,396 --> 00:34:22,556 Speaker 2: not a hypothetical, someday we'll solve it.
It's like, no, 667 00:34:22,796 --> 00:34:27,316 Speaker 2: we're on a, you know, four or five month kind of time frame. Well, 668 00:34:27,356 --> 00:34:28,396 Speaker 2: three or four month time frame. 669 00:34:28,836 --> 00:34:29,996 Speaker 1: What's the business model? 670 00:34:31,156 --> 00:34:34,476 Speaker 2: Yeah, so our business model is to work with the 671 00:34:34,516 --> 00:34:37,556 Speaker 2: ecosystem that's there today. So today, if you're a trucking 672 00:34:37,596 --> 00:34:39,476 Speaker 2: company, you buy a truck and then you pay 673 00:34:39,476 --> 00:34:42,436 Speaker 2: somebody to drive that truck for you. And so 674 00:34:42,996 --> 00:34:45,116 Speaker 2: that's what our company's going to do. If you're 675 00:34:45,996 --> 00:34:48,396 Speaker 2: a company that wants an autonomous truck, you're gonna go 676 00:34:48,436 --> 00:34:50,716 Speaker 2: to PACCAR and Peterbilt, or you're going to 677 00:34:50,756 --> 00:34:52,276 Speaker 2: go to Volvo, and say, I want to buy a truck. 678 00:34:52,956 --> 00:34:56,996 Speaker 2: And they'll buy the truck, and then it'll roll 679 00:34:57,076 --> 00:34:59,516 Speaker 2: off the line from our partner and it'll have the 680 00:34:59,636 --> 00:35:01,996 Speaker 2: Aurora Driver installed on it, and you'll pay a subscription 681 00:35:02,156 --> 00:35:04,556 Speaker 2: to Aurora to drive that truck for you. 682 00:35:04,996 --> 00:35:08,916 Speaker 1: I see. Okay. So you guys get paid by 683 00:35:08,956 --> 00:35:12,036 Speaker 1: the mile or by the haul, 684 00:35:12,196 --> 00:35:12,956 Speaker 1: or something like that. 685 00:35:13,316 --> 00:35:16,476 Speaker 2: Yeah, that's exactly right. You know, at the risk 686 00:35:16,516 --> 00:35:19,356 Speaker 2: of adding yet another as-a-service, you know, it's 687 00:35:19,476 --> 00:35:21,956 Speaker 2: driver as a service. That's the way we're thinking 688 00:35:21,956 --> 00:35:22,636 Speaker 2: about the business. 689 00:35:23,356 --> 00:35:25,356 Speaker 1: And the trucking company is buying the truck, so that's 690 00:35:25,396 --> 00:35:26,876 Speaker 1: good for you. You don't have to... When you say 691 00:35:26,916 --> 00:35:28,756 Speaker 1: you'll have a thousand trucks out there, you don't mean 692 00:35:28,796 --> 00:35:30,436 Speaker 1: you're going to own a thousand trucks. You mean you'll 693 00:35:30,476 --> 00:35:33,356 Speaker 1: have essentially a thousand autonomous drivers in trucks that other 694 00:35:33,396 --> 00:35:33,836 Speaker 1: people own. 695 00:35:34,036 --> 00:35:36,876 Speaker 2: That's right. And this is because that's what our customers 696 00:35:36,876 --> 00:35:39,876 Speaker 2: want, right. 697 00:35:39,916 --> 00:35:42,836 Speaker 1: Also, it seems easier as a business, and you still don't have 698 00:35:42,916 --> 00:35:44,596 Speaker 1: to get into the trucking business. 699 00:35:44,916 --> 00:35:47,556 Speaker 2: Agreed. Again, back to the philosophy of: we think 700 00:35:47,556 --> 00:35:50,796 Speaker 2: we're really good at building the driver. And I look 701 00:35:50,836 --> 00:35:54,396 Speaker 2: at our partners, whether it's Werner or Hirschbach or Federal Express, 702 00:35:55,276 --> 00:35:57,436 Speaker 2: they know what they're doing. Why don't we help them 703 00:35:57,436 --> 00:36:00,196 Speaker 2: make their business a bit better and work together?
704 00:36:03,676 --> 00:36:06,076 Speaker 1: Can we talk about autonomy a little bit more broadly, 705 00:36:06,196 --> 00:36:08,356 Speaker 1: just since you've been at it so long and I'm curious 706 00:36:08,396 --> 00:36:12,516 Speaker 1: what you think. So, like, what do you think are 707 00:36:12,516 --> 00:36:17,396 Speaker 1: the important constraints on autonomous vehicles right now, sort of 708 00:36:17,436 --> 00:36:21,116 Speaker 1: broadly: technologically, you know, in terms of policy? 709 00:36:21,876 --> 00:36:25,716 Speaker 2: I think it's primarily technological right now. Okay, right? And 710 00:36:25,756 --> 00:36:28,196 Speaker 2: I've said that. You know, people for the better 711 00:36:28,196 --> 00:36:30,676 Speaker 2: part of a decade have been, you know, when 712 00:36:30,676 --> 00:36:34,996 Speaker 2: it's convenient, hiding behind, you know, regulation or policy. And 713 00:36:35,556 --> 00:36:38,716 Speaker 2: you know, the US policy and regulatory environment is permissive. 714 00:36:38,836 --> 00:36:40,876 Speaker 2: It's one of the things that's allowed innovation to flourish 715 00:36:40,916 --> 00:36:43,956 Speaker 2: in this country, certainly in the automotive space. 716 00:36:44,516 --> 00:36:47,876 Speaker 1: So what are the important technological constraints 717 00:36:47,916 --> 00:36:51,996 Speaker 1: right now? Like, I mean, obviously Waymo is thriving where 718 00:36:52,036 --> 00:36:55,676 Speaker 1: it's living, but, like, also there are no autonomous cars 719 00:36:55,676 --> 00:36:57,956 Speaker 1: where I live, and people aren't buying them. And so, 720 00:36:58,076 --> 00:37:01,196 Speaker 1: like, what are the important bottlenecks right now? 721 00:37:01,716 --> 00:37:06,676 Speaker 2: Execution bandwidth, right? Like, at least where 722 00:37:06,676 --> 00:37:10,276 Speaker 2: I sit at Aurora today, I don't look out and see, 723 00:37:10,436 --> 00:37:12,076 Speaker 2: oh my gosh, I don't know how we're going to 724 00:37:12,116 --> 00:37:16,116 Speaker 2: solve that. It's, hey, we have this many people, we 725 00:37:16,196 --> 00:37:18,436 Speaker 2: have this much leadership bandwidth, we just need to 726 00:37:18,436 --> 00:37:20,036 Speaker 2: go execute, right? 727 00:37:19,916 --> 00:37:24,196 Speaker 1: Like, thinking industry wide, and not to be, like, petty 728 00:37:24,276 --> 00:37:26,756 Speaker 1: or something, but, like, why aren't autonomous cars everywhere? 729 00:37:26,836 --> 00:37:29,716 Speaker 2: Right? Yeah? Because it's really hard, right? And I think 730 00:37:29,796 --> 00:37:34,156 Speaker 2: the right metaphor for this is commercial aviation. Like, the 731 00:37:34,156 --> 00:37:36,516 Speaker 2: physics are really well in grasp, and, like, you can bolt 732 00:37:36,556 --> 00:37:40,636 Speaker 2: things together. And yet, you know, we basically have two 733 00:37:42,756 --> 00:37:46,916 Speaker 2: commercial aviation manufacturers globally, right? We have Boeing and Airbus. 734 00:37:47,076 --> 00:37:49,956 Speaker 1: Because it's hard to build big jet aircraft. 735 00:37:50,636 --> 00:37:53,156 Speaker 2: It's hard to build them, it's hard to have conviction 736 00:37:53,196 --> 00:37:56,396 Speaker 2: they're going to be safe, it's hard to produce them, right?
737 00:37:56,436 --> 00:37:58,396 Speaker 2: And when I look at automated driving, I think it's 738 00:37:58,396 --> 00:38:01,356 Speaker 2: in a very similar space, in that a lot of 739 00:38:01,396 --> 00:38:03,676 Speaker 2: things have to work well. You have to have the 740 00:38:03,756 --> 00:38:05,996 Speaker 2: discipline to be able to work across all of them, 741 00:38:06,076 --> 00:38:08,836 Speaker 2: make sure they all fit together, and then you have 742 00:38:08,876 --> 00:38:12,236 Speaker 2: to have the processes. And process in Silicon Valley is 743 00:38:12,276 --> 00:38:14,076 Speaker 2: often kind of a four letter word, like, you know, 744 00:38:14,076 --> 00:38:16,356 Speaker 2: you've got to move fast and break things. But when 745 00:38:16,396 --> 00:38:19,916 Speaker 2: you're building something safety critical, you actually have to get 746 00:38:19,956 --> 00:38:22,836 Speaker 2: that stuff right, and the art is getting it right 747 00:38:22,956 --> 00:38:24,556 Speaker 2: while also being able to do it quickly. 748 00:38:25,556 --> 00:38:30,396 Speaker 1: So when you think about the spread of autonomy 749 00:38:30,436 --> 00:38:33,876 Speaker 1: from here, again not just within your company, but sort 750 00:38:33,876 --> 00:38:37,156 Speaker 1: of more broadly, like, how do you think it'll play out? 751 00:38:37,596 --> 00:38:40,756 Speaker 2: I think it's going to usher in an incredible new age, right, 752 00:38:40,836 --> 00:38:48,276 Speaker 2: in that we should be able to dramatically reduce traffic fatalities. 753 00:38:48,316 --> 00:38:51,476 Speaker 2: The fact is, you know, it had been a decade 754 00:38:51,516 --> 00:38:55,876 Speaker 2: without a fatality in civil aviation, and yet we had 755 00:38:55,876 --> 00:38:58,076 Speaker 2: forty thousand people killed on the road every year in 756 00:38:58,076 --> 00:39:01,276 Speaker 2: the US. Like, we can bring a technology to bear 757 00:39:01,316 --> 00:39:03,796 Speaker 2: to help solve this, right, and I think that's profound. 758 00:39:03,796 --> 00:39:06,756 Speaker 2: I think we can have a huge sustainability impact, all 759 00:39:06,796 --> 00:39:12,116 Speaker 2: while actually kind of powering up the US economy. And 760 00:39:12,196 --> 00:39:14,396 Speaker 2: so for us, you know, we've got this mission right 761 00:39:14,396 --> 00:39:17,556 Speaker 2: now that's focused on trucking, but what we're really building 762 00:39:17,636 --> 00:39:22,236 Speaker 2: is this capability to release safety critical software and systems, 763 00:39:22,836 --> 00:39:25,636 Speaker 2: and this ability to understand the world. And so you 764 00:39:25,676 --> 00:39:27,596 Speaker 2: combine those two things, and the places we can go with 765 00:39:27,596 --> 00:39:30,516 Speaker 2: that are profound, right? Whether it is, you know, last 766 00:39:30,556 --> 00:39:34,996 Speaker 2: mile delivery or ride hailing applications, or mining or farming 767 00:39:35,276 --> 00:39:37,436 Speaker 2: or aviation. Like, it's just going to be a 768 00:39:37,436 --> 00:39:40,396 Speaker 2: lot of fun over the next decade. So yeah, I'm 769 00:39:40,396 --> 00:39:41,036 Speaker 2: psyched for it. 770 00:39:41,756 --> 00:39:45,836 Speaker 1: Next decade sounds fast given the pace so far. I 771 00:39:45,876 --> 00:39:48,476 Speaker 1: was surprised when you said next decade at the end there.
772 00:39:48,676 --> 00:39:52,356 Speaker 2: Yeah, I think that, like anything, there's a point where 773 00:39:52,396 --> 00:39:57,916 Speaker 2: you have built a foundational base and then suddenly you're like, oh, 774 00:39:58,356 --> 00:40:00,316 Speaker 2: now I have this power. Now I have the superpower 775 00:40:00,356 --> 00:40:01,276 Speaker 2: to go do these things. 776 00:40:01,676 --> 00:40:04,556 Speaker 1: It's basically the package of hardware plus software, at a 777 00:40:04,556 --> 00:40:06,956 Speaker 1: reductive level, and you can sort of put it on 778 00:40:07,036 --> 00:40:09,076 Speaker 1: other things and it'll more or less work. 779 00:40:09,356 --> 00:40:12,956 Speaker 2: More importantly, the process and the data. And again, that does 780 00:40:12,996 --> 00:40:14,476 Speaker 2: not sound sexy. 781 00:40:14,756 --> 00:40:17,356 Speaker 1: But the data sounds sexy. Like, the more you talk 782 00:40:17,356 --> 00:40:20,076 Speaker 1: to machine learning people, the more sexy data sounds, right? 783 00:40:20,276 --> 00:40:23,116 Speaker 2: Yeah. And you know, we joke internally, you know, 784 00:40:23,156 --> 00:40:26,076 Speaker 2: we have three kind of big artifacts, 785 00:40:26,116 --> 00:40:30,076 Speaker 2: being our software, our data, and our process. If you 786 00:40:30,076 --> 00:40:31,876 Speaker 2: held a gun to my head and said delete one 787 00:40:31,916 --> 00:40:37,116 Speaker 2: of them, I'd say delete the software. Right? Please do 788 00:40:37,196 --> 00:40:38,316 Speaker 2: not delete our software. 789 00:40:40,116 --> 00:40:46,036 Speaker 1: Yeah, that's really interesting. We'll be back in a 790 00:40:46,076 --> 00:40:59,236 Speaker 1: minute with the lightning round. We're gonna finish with the 791 00:40:59,316 --> 00:41:01,796 Speaker 1: lightning round. I appreciate your time, and we're almost done. 792 00:41:02,156 --> 00:41:03,596 Speaker 2: Cool. Thanks for the conversation. 793 00:41:05,756 --> 00:41:08,276 Speaker 1: So, is it right that your father was a prison warden? He was, 794 00:41:08,316 --> 00:41:11,916 Speaker 1: in fact. Yeah. Did you learn any management or 795 00:41:11,956 --> 00:41:13,196 Speaker 1: parenting tips from him? 796 00:41:14,636 --> 00:41:19,756 Speaker 2: Lock them up? Uh, no, absolutely not. You know, one 797 00:41:19,756 --> 00:41:22,796 Speaker 2: of the things that stuck with me is, you know, 798 00:41:22,796 --> 00:41:25,396 Speaker 2: it's a difficult job being a prison warden, because 799 00:41:26,076 --> 00:41:28,596 Speaker 2: you have the staff and then you have the inmates, 800 00:41:29,276 --> 00:41:34,836 Speaker 2: right, and they're often at odds. And one 801 00:41:34,836 --> 00:41:36,236 Speaker 2: of the things he said was, you know, make sure 802 00:41:36,236 --> 00:41:38,996 Speaker 2: you treat everyone with respect. Right? He would say, you know, 803 00:41:39,036 --> 00:41:42,196 Speaker 2: these are people who had made a mistake in life, 804 00:41:42,636 --> 00:41:46,196 Speaker 2: and you treat them with respect. And 805 00:41:46,596 --> 00:41:48,756 Speaker 2: you know, I think that's really powerful and important. 806 00:41:49,916 --> 00:41:51,596 Speaker 1: What's one thing you learned working at Google?
807 00:41:53,076 --> 00:41:56,076 Speaker 2: Think big, right. I think that was, 808 00:41:58,396 --> 00:42:01,476 Speaker 2: you know, Larry and Sergey never lacked for vision, and 809 00:42:01,556 --> 00:42:04,756 Speaker 2: I think you can often find yourself constrained by kind 810 00:42:04,796 --> 00:42:07,116 Speaker 2: of your own sense of what's a limit, 811 00:42:07,636 --> 00:42:10,316 Speaker 2: and they were just thinking, like, no, why can't you 812 00:42:10,396 --> 00:42:13,156 Speaker 2: do that? Why can't you do that? Yeah. 813 00:42:13,196 --> 00:42:16,716 Speaker 1: I remember, just as a consumer, just as 814 00:42:16,716 --> 00:42:20,556 Speaker 1: an ordinary person in the world, when they started what 815 00:42:20,716 --> 00:42:23,316 Speaker 1: became Google Street View, I was like, yeah, surely you 816 00:42:23,356 --> 00:42:27,596 Speaker 1: can't take a picture of every bit of every road 817 00:42:27,716 --> 00:42:31,516 Speaker 1: in America. And actually you can, and they did. 818 00:42:31,796 --> 00:42:34,476 Speaker 2: Yeah, right. Like, I think that sense of, like, 819 00:42:34,676 --> 00:42:36,636 Speaker 2: you don't need to be intimidated by this; you 820 00:42:36,676 --> 00:42:38,916 Speaker 2: need to think rationally, you need to figure it out. 821 00:42:39,396 --> 00:42:41,036 Speaker 2: You know, I think it's powerful. One of 822 00:42:41,076 --> 00:42:45,876 Speaker 2: my professors at Carnegie Mellon said, dream like an amateur, 823 00:42:46,356 --> 00:42:49,436 Speaker 2: execute like a professional, right? And I think that's just a 824 00:42:49,516 --> 00:42:51,036 Speaker 2: really kind of profound sentiment. 825 00:42:52,396 --> 00:42:54,156 Speaker 1: What's one thing your time at Google taught you not 826 00:42:54,276 --> 00:42:54,596 Speaker 1: to do? 827 00:42:55,236 --> 00:43:00,476 Speaker 2: Oh, what it taught me not to do? 828 00:43:02,996 --> 00:43:03,116 Speaker 1: Uh? 829 00:43:03,396 --> 00:43:06,116 Speaker 2: Maybe kind of adjacent to that. So when I went 830 00:43:06,156 --> 00:43:07,996 Speaker 2: to Google, I thought I was a pretty good programmer. 831 00:43:08,116 --> 00:43:10,596 Speaker 2: You know, I was, yeah, at Carnegie Mellon, I was just good 832 00:43:10,596 --> 00:43:15,116 Speaker 2: at what I did, and I'm not a bad programmer. 833 00:43:15,596 --> 00:43:18,836 Speaker 2: But, like, there are truly exceptional people out in the world, right? 834 00:43:18,876 --> 00:43:22,276 Speaker 2: And so I think, you know, maybe don't overestimate yourself, 835 00:43:23,676 --> 00:43:27,196 Speaker 2: and kind of, again, think big; 836 00:43:27,236 --> 00:43:28,516 Speaker 2: don't underestimate what's out there. 837 00:43:34,476 --> 00:43:37,436 Speaker 1: Chris Urmson is the co founder and CEO of Aurora. 838 00:43:38,516 --> 00:43:41,836 Speaker 1: Please email us at problem at pushkin dot fm. We 839 00:43:41,916 --> 00:43:45,596 Speaker 1: are always looking for new guests for the show. Today's 840 00:43:45,636 --> 00:43:49,436 Speaker 1: show was produced by Trina Menino and Gabriel Hunter-Chang. It 841 00:43:49,596 --> 00:43:53,436 Speaker 1: was edited by Alexandra Garretón and engineered by Sarah Bruguiere. 842 00:43:53,836 --> 00:43:55,996 Speaker 1: I'm Jacob Goldstein, and we'll be back next week with 843 00:43:56,036 --> 00:44:09,276 Speaker 1: another episode of What's Your Problem.