Speaker 1: Get in touch with technology with TechStuff from HowStuffWorks.com. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with HowStuffWorks and iHeartRadio, and I love all things tech. In our last episode (if you haven't heard that, you should probably go and listen to it), I left off with Ernst Dickmanns, the German engineer who had done amazing work with dynamic computer vision and vehicle automation in the nineteen eighties and nineties. But that work would only go as far as the sophistication of the technology of the time would allow, and funding for AI work had gotten pretty darn scarce in the eighties and nineties. So while he and his team were prepping for their nineteen ninety-four demonstrations that were so impressive, there was similar work going on in the United States. From nineteen ninety-one to nineteen ninety-nine, research teams took advantage of six hundred fifty million dollars of funding created by the US government through the Intermodal Surface Transportation Efficiency Act.
That act was meant to support planning for several transportation projects, including the creation of new rail lines between various cities. The autonomous vehicle research was really just one piece of this larger legislation. And also, just to let you guys know in case you're curious, many of those projects did not receive enough funding to really accomplish their goals, so they never came to fruition. It's one of the downsides of government projects: when the budgets are too low, nothing of real import tends to happen. People might get paid by some of that money, but the actual projects don't necessarily, you know, happen. On the driverless car side, nine organizations began to work on technologies to make driverless cars a reality, but the deadline for meeting those goals in a suitable demonstration was nineteen ninety-seven, and from what I can tell, not a whole lot of progress was made in that time, or at least not enough to end up with an impressive demonstration that marked a point in history where we would say, this is really important in driverless cars.
But work continued on the various systems that would be necessary to make driverless cars. You could argue that driverless car research and development as a whole, as a goal in and of itself, was kind of shelved for a while. There were people working on various parts of the problem. Most of them were working in fields that ultimately would prove important to the development of driverless cars. But again, they were doing this for other reasons; it wasn't necessarily with the end goal of creating an autonomous vehicle. Much of the excitement and money around the concept had died down by the mid to late nineties. Some of those related research projects would end up growing out of a US-backed R and D project called the Strategic Computing Initiative, or SCI. Now, that actually got started in nineteen eighty-three. The goal of the initiative was incredibly ambitious. That's a very kind way of putting it. I'd actually say it was unrealistically ambitious.
It was to develop the technology to create a machine learning system capable of running, quote, ten billion instructions per second to see, hear, speak and think like a human. The degree of integration required would rival that achieved by the human brain, the most complex instrument known to man, end quote. That was actually written in an account by Alex Roland and Philip Shiman for the MIT Press about the SCI project. The goal was to achieve that in just a decade. So, starting in nineteen eighty-three, the plan was that by nineteen ninety-three we were gonna have a computer that can think and experience like a human can. Wowsers. Well, it was pretty obvious by the late eighties that this was not gonna happen. They were not going to hit that goal, and in retrospect, we would probably call this the result of hubris. The human brain is far more complicated, and technology is far more limited, than we gave either credit for back in nineteen eighty-three.
Nevertheless, the Defense Advanced Research Projects Agency, or DARPA, and the Department of Defense, which is the department that oversees DARPA, poured about a billion dollars into funding various programs throughout the United States in an effort to achieve this goal. And while we did not get a computer that thinks like a person out of this whole process, many influential computer scientists and research projects were able to advance our capabilities and understanding through their work, which got funding from this project. So we didn't get what the project aimed to produce, but we did benefit from it. Some of that work would become really important for the next big DARPA initiative that I'm going to cover, and that is the Grand Challenge. The history of the Grand Challenge dates back to two thousand one. At that time, the United States Congress had its own challenge for the various branches of the US military. Congress said, we want you to develop the technology necessary to allow one third of all military ground combat vehicles to be uncrewed by two thousand fifteen.
In other words, to have autonomous ground combat vehicles, one third of all of them. The purpose, obviously, was to keep soldiers out of harm's way as much as possible. If you could have these ground combat vehicles operate autonomously, then if one gets destroyed, that's a huge amount of money gone down the drain, but no one dies, at least no one on your side. Achieving this goal would also be super difficult to do. Technology just wasn't where it would need to be by itself. The defense contractors that DARPA would work with on these sorts of solutions weren't making the progress necessary to meet that two thousand fifteen goal, and DARPA recognized this early on. They said, this is just not gonna happen.
So in the final report for the two thousand four Grand Challenge, which was the first of the three Grand Challenges around autonomous cars, the logic that the agency laid out to justify this challenge was this, quote: while there have been a number of significant technical breakthroughs leading to robust unmanned air vehicles that US forces use today, progress in unmanned autonomous ground vehicle technology has not occurred at a similar rate. Vehicle operations in a ground environment are a much more difficult challenge due to terrain, man-made obstacles, and weather, end quote. And that's just scratching the surface. They actually were pretty, you know, generous in that regard, because as it turns out, there are a lot of other factors that make this a really difficult problem. One other thing, a quick tangent that I think is important here, is DARPA pointing out: yes, we came up with autonomous systems for aircraft ages ago. We have lots of them. We have unmanned drones that we can fly, but we haven't really done that with cars. I think that's also a good reminder that technology does not all progress at the same accelerated rate.
We have Moore's Law, which has kind of conditioned us to think about our technology advancing rapidly over the course of every two years. But that doesn't apply to every technology. It applies specifically these days to computational power, and originally it only dealt with the number of transistors you could fit on a square inch of silicon wafer. So this is good to remember, because I think a lot of futurists just kind of apply Moore's Law to all technologies and assume that everything is moving at that same accelerated rate. Back to autonomous cars. Jose Negron, who worked at DARPA at the time of the Grand Challenges, went to the director of DARPA with an idea. He said, there are a lot of people out there who could potentially contribute to the advancement of the technology we're going to need for autonomous cars, but we normally wouldn't work with them: they're independent innovators, they work with smaller groups, they aren't part of defense contractor companies, and so they don't often get a chance to do any sort of work for the Department of Defense.
140 00:09:11,840 --> 00:09:14,000 Speaker 1: But we could take advantage of them if we are 141 00:09:14,080 --> 00:09:16,400 Speaker 1: if we give them the chance to participate, we could 142 00:09:16,480 --> 00:09:19,760 Speaker 1: benefit from that and they might create the breakthroughs that 143 00:09:19,800 --> 00:09:24,280 Speaker 1: we need to meet that two thousand fifteen deadline. And so, 144 00:09:24,520 --> 00:09:27,960 Speaker 1: after a couple of years of funding various research projects, 145 00:09:28,000 --> 00:09:31,319 Speaker 1: the then director of DARPA was a guy named Tony Teather. 146 00:09:31,679 --> 00:09:35,160 Speaker 1: Anthony Tather announced a new initiative and it was for 147 00:09:35,240 --> 00:09:39,480 Speaker 1: an open challenge to any teams that wanted to participate. 148 00:09:39,559 --> 00:09:44,559 Speaker 1: Anyone could apply to be part of this challenge. Specifically, 149 00:09:45,240 --> 00:09:49,320 Speaker 1: the challenge was to develop autonomous vehicle technology and incorporate 150 00:09:49,400 --> 00:09:53,240 Speaker 1: it into a vehicle that could traverse a one forty 151 00:09:53,280 --> 00:09:58,240 Speaker 1: two mile course without human intervention. So from the start 152 00:09:58,280 --> 00:10:00,280 Speaker 1: of the course to the end of it, the vehicle 153 00:10:00,320 --> 00:10:03,320 Speaker 1: would have to be operating under its own abilities, its 154 00:10:03,320 --> 00:10:07,000 Speaker 1: own power. The team to complete this course in the 155 00:10:07,040 --> 00:10:12,760 Speaker 1: shortest amount of time would take home a million dollars. Now, initially, 156 00:10:13,360 --> 00:10:16,440 Speaker 1: Anthony Tather was skeptical that this announcement was going to 157 00:10:16,520 --> 00:10:19,520 Speaker 1: draw much interest. He said, we'll get a shot, but 158 00:10:19,559 --> 00:10:21,640 Speaker 1: I don't think very many people are going to respond. 
159 00:10:22,240 --> 00:10:25,840 Speaker 1: DARPA scheduled a kickoff event at the Peterson Automotive Museum 160 00:10:25,840 --> 00:10:30,400 Speaker 1: in Los Angeles, California for anyone interested in participating. On 161 00:10:30,480 --> 00:10:33,599 Speaker 1: the day of that event, Anthony Tather arrived at the 162 00:10:33,679 --> 00:10:37,200 Speaker 1: venue a half hour before the scheduled time to start, 163 00:10:37,720 --> 00:10:40,360 Speaker 1: and he discovered there was a line wrapping around the 164 00:10:40,400 --> 00:10:44,240 Speaker 1: block of people who were interested in participating. According to 165 00:10:44,240 --> 00:10:46,920 Speaker 1: the final report for the two thousand four Challenge, more 166 00:10:46,960 --> 00:10:50,479 Speaker 1: than four hundred people showed up to that first conference. 167 00:10:51,360 --> 00:10:55,120 Speaker 1: The race course for this two thousand four event would 168 00:10:55,200 --> 00:10:59,760 Speaker 1: end up being incredibly challenging. A guy named Salve Fish 169 00:11:00,120 --> 00:11:03,280 Speaker 1: actually designed the course and he made it really tough. 170 00:11:03,720 --> 00:11:07,640 Speaker 1: Vehicles would travel through the desert, including trips up and 171 00:11:07,760 --> 00:11:11,760 Speaker 1: down hills and through switchbacks. Sometimes the road would narrow 172 00:11:11,800 --> 00:11:14,600 Speaker 1: down to about ten feet wide. There are points where 173 00:11:14,640 --> 00:11:17,520 Speaker 1: the road would drop off several feet to one side 174 00:11:17,800 --> 00:11:21,520 Speaker 1: or the other. The course crossed railroad tracks. There was 175 00:11:21,559 --> 00:11:25,320 Speaker 1: the chance that the cars might encounter animals along the course. 
176 00:11:26,040 --> 00:11:29,920 Speaker 1: DARPA had some professional drivers actually go down this course 177 00:11:30,000 --> 00:11:33,040 Speaker 1: after it had been established to kind of give their 178 00:11:33,160 --> 00:11:36,560 Speaker 1: estimation of it, and they said it was fairly challenging 179 00:11:36,600 --> 00:11:40,640 Speaker 1: even for a professional trained off road driver. Any autonomous 180 00:11:40,640 --> 00:11:42,880 Speaker 1: car that was to complete this course was expected to 181 00:11:42,920 --> 00:11:46,960 Speaker 1: do so within ten hours of starting, so there's a 182 00:11:47,000 --> 00:11:51,880 Speaker 1: ten hour time limit essentially from start to finish. So 183 00:11:51,920 --> 00:11:54,920 Speaker 1: why did they make it so challenging. Well, keep in mind, 184 00:11:55,040 --> 00:11:58,360 Speaker 1: the ultimate goal of this and the following challenges was 185 00:11:58,400 --> 00:12:01,840 Speaker 1: to encourage the development of technology that would allow the 186 00:12:01,920 --> 00:12:06,439 Speaker 1: US military to build autonomous military vehicles. And the US 187 00:12:06,720 --> 00:12:10,440 Speaker 1: was and still is, heavily involved in operations in the 188 00:12:10,440 --> 00:12:13,400 Speaker 1: Middle East, and so the course was partly designed to 189 00:12:13,480 --> 00:12:17,440 Speaker 1: mimic the conditions that military vehicles might regularly encounter in 190 00:12:17,480 --> 00:12:21,320 Speaker 1: that part of the world. And while on a test 191 00:12:21,440 --> 00:12:24,680 Speaker 1: we might say, while that's really challenging, in a real 192 00:12:24,720 --> 00:12:30,319 Speaker 1: world scenario, we don't really. We can't change the parameters. 193 00:12:30,400 --> 00:12:33,280 Speaker 1: It's it is what it is. So it needed to 194 00:12:33,320 --> 00:12:35,880 Speaker 1: be really hard. It needed to be hard also to 195 00:12:35,960 --> 00:12:40,319 Speaker 1: spur on that innovation. 
The course would begin in Barstow, California, and it would end in Primm, Nevada, passing through the Mojave Desert. All right, I'm gonna talk about how this challenge unfolded, but first let's take a quick break to thank our sponsor. The teams had just one year to develop, build, test, and complete their technologies. So, like I said, this was an enormous challenge. Competing teams included universities, research facilities, and some company teams. There were even some high school teams that applied. DARPA received one hundred six applications. Now, not everyone who attended the kickoff event went through with the step of applying, and the rules did restrict who could participate. US federal government organizations could not participate in this event. Federal employees could participate, but only as private citizens on a team; they could not represent a federally backed team.
Federal funding wasn't allowed either, and teams were prohibited from using government-owned equipment, with one exception: if the government were to offer equipment to all teams, that was fine, but no team could take advantage of government-owned equipment if it were the only team that had access to it. DARPA set up a website for the competition that included a forum in which interested parties could post questions. They could share advice, and they could share strategies for how they were going to solve really hard problems like obstacle detection, path finding, position location, and the control software that would be needed for the vehicle itself. Teams could also help each other find sponsors, to help get money and funding to cover expenses, and it sounds to me like it was a fairly collaborative space despite the competitive nature of the objective. The general strategy for most teams was to pair several technologies together to create an autonomous car, so they weren't necessarily developing all these tools themselves.
In some cases, they were making use of existing tools, so a lot of them would use stuff like a GPS receiver. I've talked about GPS extensively not too long ago, so I'm not going to go over that again. Also optical sensors, so essentially cameras and similar sensors, laser range finders, lidar systems, things like that. Some teams spent most of their time developing the computer programs that would make decisions based on the incoming data from these various sensors, such as whether a vehicle should accelerate to climb a steep hill, or brake to avoid going off the edge of the road, or steer in a particular direction. One team created a system that used a voting mechanism to decide what to do at any given point, and it would weight various factors in that decision-making process so that the computer could arrive at what it thought was the best course of action for any given set of circumstances.
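The episode doesn't describe exactly how that team's voting system worked, so here's a minimal, hypothetical sketch of the general idea (the behavior names, weights, and function are my own illustration, not the team's actual code): each sensor-driven behavior casts votes for or against candidate actions, the arbiter scales those votes by how much it trusts each behavior, and the highest-scoring action wins.

```python
from collections import defaultdict

# Hypothetical sketch of a weighted-voting arbiter like the one described:
# each behavior votes on candidate actions in [-1, 1], votes are scaled by
# that behavior's weight, and the action with the highest total is chosen.
def arbitrate(behavior_votes, weights):
    """behavior_votes: {behavior: {action: vote in [-1, 1]}}
    weights: {behavior: importance weight, default 1.0}"""
    totals = defaultdict(float)
    for behavior, votes in behavior_votes.items():
        w = weights.get(behavior, 1.0)
        for action, vote in votes.items():
            totals[action] += w * vote
    return max(totals, key=totals.get)

votes = {
    "obstacle_avoidance": {"brake": 0.9, "steer_left": 0.4, "accelerate": -1.0},
    "path_following":     {"accelerate": 0.8, "steer_left": 0.1, "brake": -0.2},
}
weights = {"obstacle_avoidance": 2.0, "path_following": 1.0}
print(arbitrate(votes, weights))  # prints "brake"
```

The weighting is what lets a safety-critical behavior like obstacle avoidance overrule path following even when both are casting votes at the same moment.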
DARPA chose twenty-five of the teams after reviewing their technical applications. Immediately upon looking at the technical applications, they were able to select nineteen teams outright. They said, all right, these nineteen really have a solid application. For the other six, they visited several of the remaining teams for site visits, kind of to get an idea of how far along each project was, how realistic the chances were that the team was going to be ready by race time, and generally just to get a gut feeling for who had a real chance to do something special at this competition. The teams were invited to a qualifying event at the California Speedway in Fontana, California, and of the twenty-five that were selected, twenty-one showed up. Four teams did not make it to the qualifying event. The qualifying event tested vehicles in many ways, including responsiveness, speed, and safety. The vehicles went through some areas that would simulate what the cars would encounter in the actual race. Only seven teams were actually able to navigate the full qualifier, but DARPA selected another eight on top of the seven that made it through.
262 00:17:06,400 --> 00:17:09,639 Speaker 1: They were close enough for consideration, and so the on 263 00:17:10,160 --> 00:17:13,920 Speaker 1: plus teams that had applied the competition was narrowed down 264 00:17:13,960 --> 00:17:18,000 Speaker 1: to fifteen teams. For the big race itself, DARPA had 265 00:17:18,080 --> 00:17:22,159 Speaker 1: some other special rules. One was that each team would 266 00:17:22,160 --> 00:17:25,840 Speaker 1: not be given any information about the course until about 267 00:17:25,920 --> 00:17:29,520 Speaker 1: two hours before the actual start time. Another was that 268 00:17:30,119 --> 00:17:33,960 Speaker 1: a human driven chase vehicle would follow each of the 269 00:17:34,000 --> 00:17:37,560 Speaker 1: autonomous vehicles, and inside that chase vehicle there would be 270 00:17:37,560 --> 00:17:40,800 Speaker 1: a laptop operated by a team member that would allow 271 00:17:40,880 --> 00:17:44,959 Speaker 1: them to activate a kill switch. So if an autonomous 272 00:17:45,000 --> 00:17:48,320 Speaker 1: car or motorcycle more on that in a second where 273 00:17:48,359 --> 00:17:51,400 Speaker 1: to go nuts, start careening off course and driving out 274 00:17:51,400 --> 00:17:54,399 Speaker 1: of control, a human in the chase vehicle could hit 275 00:17:54,400 --> 00:17:57,680 Speaker 1: the kill switch and that would shut down the target 276 00:17:57,800 --> 00:18:01,639 Speaker 1: vehicle before any real calamity could occur. The cars weren't 277 00:18:01,640 --> 00:18:03,920 Speaker 1: meant all to leave at the same time. It wasn't 278 00:18:03,960 --> 00:18:06,600 Speaker 1: like that kind of race where they all lined up 279 00:18:06,640 --> 00:18:10,200 Speaker 1: and then the flag dropped. 
There was a staggered departure time, staggered by several minutes, so the goal was to have the fastest time from start to finish, not necessarily to physically race against another vehicle in real time, although there was the potential that a car could catch up with whichever vehicles were in front of it and pass them; that was a possibility. DARPA told CNN that about five hundred people were working on the race in total; some of them were volunteers, some of them were DARPA staff members. The race also employed several wreckers, tow trucks essentially, so if a vehicle got stuck, a wrecker could come in and pull it out of the way, and that vehicle would obviously be eliminated from the race. And whoever passed the finish line in the shortest amount of time would walk away with a cool one million dollars in prize money. Now, one of the participants in this race, just as a moment of interest here, was a guy named Anthony Levandowski, and I've talked about Levandowski on TechStuff.
I think it was a year ago when I was talking about Levandowski. His approach was unique in that he brought a self-balancing motorcycle to the event rather than a four-wheeled vehicle. Levandowski became somewhat infamous for departing Google, where he worked in their autonomous vehicle department, going to work for Uber, and allegedly bringing along with him some proprietary information from Google. That was what I was talking about the last time I chatted about Levandowski. The fifteen teams, in order of their start time, were: the Red Team from Carnegie Mellon University; SciAutonics II from Thousand Oaks, California; Team Caltech from the California Institute of Technology; Digital Auto Drive, or DAD, from Morgan Hill, California; Virginia Tech, which is from Virginia; Axion Racing from Westlake Village, California; Team CajunBot, which might have my favorite name there, from Lafayette, Louisiana; Team ENSCO from Falls Church, Virginia; Team CIMAR, that's C-I-M-A-R, from Gainesville, Florida, and Logan, Utah.
It 315 00:20:33,440 --> 00:20:37,760 Speaker 1: was a joint project. The Palos Verdes High School Road 316 00:20:37,800 --> 00:20:42,280 Speaker 1: Warriors from Palos Verdes Estates, California, SciAutonics 317 00:20:42,400 --> 00:20:46,760 Speaker 1: I from Thousand Oaks, so they actually left the starting 318 00:20:46,800 --> 00:20:51,520 Speaker 1: line well after SciAutonics II. Team Terra 319 00:20:51,600 --> 00:20:56,520 Speaker 1: Max from Oshkosh, Wisconsin, Team TerraHawk from Gardena, California, 320 00:20:56,840 --> 00:21:00,359 Speaker 1: the Golem Group from Santa Monica, California, and the Blue 321 00:21:00,359 --> 00:21:04,080 Speaker 1: Team from Berkeley, California. At six thirty in the morning 322 00:21:04,520 --> 00:21:09,439 Speaker 1: on Saturday, March thirteenth, two thousand four, the Red Team's vehicle, 323 00:21:09,640 --> 00:21:13,680 Speaker 1: called Sandstorm, from Carnegie Mellon became 324 00:21:13,720 --> 00:21:17,160 Speaker 1: the first of the driverless cars to tackle this course. Now, 325 00:21:17,240 --> 00:21:20,480 Speaker 1: it would get about seven and a half miles down 326 00:21:20,520 --> 00:21:23,320 Speaker 1: the road, but then it got stuck while trying to 327 00:21:23,400 --> 00:21:28,240 Speaker 1: navigate a tight switchback. It was the most successful of 328 00:21:28,280 --> 00:21:31,119 Speaker 1: all the vehicles, seven and a half miles out of 329 00:21:31,119 --> 00:21:35,119 Speaker 1: a hundred forty two. So out of the fifteen cars that 330 00:21:35,680 --> 00:21:39,080 Speaker 1: made it through to the competition, two of them withdrew 331 00:21:39,280 --> 00:21:41,199 Speaker 1: prior to the start of the race. That would be 332 00:21:41,200 --> 00:21:44,520 Speaker 1: TerraHawk and Blue Team.
Four of them didn't make 333 00:21:44,520 --> 00:21:47,360 Speaker 1: it out of the starting area because of various problems 334 00:21:47,480 --> 00:21:51,280 Speaker 1: ranging from steering abnormalities to colliding with a wall, which I 335 00:21:51,280 --> 00:21:54,960 Speaker 1: guess in its way is its own steering abnormality. Then 336 00:21:55,000 --> 00:21:57,480 Speaker 1: you had Team ENSCO's vehicle, which made it point 337 00:21:57,520 --> 00:22:02,000 Speaker 1: two miles before it flipped over. Team CIMAR's car got 338 00:22:02,040 --> 00:22:05,800 Speaker 1: tangled up by wire half a mile in. The Golem 339 00:22:05,840 --> 00:22:08,960 Speaker 1: Group's vehicle got stuck on an incline and couldn't provide 340 00:22:09,040 --> 00:22:12,600 Speaker 1: enough throttle to overcome it. Team TerraMax's vehicle kept 341 00:22:12,600 --> 00:22:16,159 Speaker 1: detecting bushes and eventually just stopped moving forward after it 342 00:22:16,480 --> 00:22:20,359 Speaker 1: thought that there were bushes everywhere, I guess. It wouldn't budge. 343 00:22:20,800 --> 00:22:23,200 Speaker 1: Team Caltech's car went off course and through a 344 00:22:23,200 --> 00:22:25,480 Speaker 1: fence and could not get back on course, and so 345 00:22:25,520 --> 00:22:28,960 Speaker 1: it was disqualified. So none of the vehicles were able 346 00:22:29,000 --> 00:22:32,399 Speaker 1: to complete this one hundred forty two mile course, not by a 347 00:22:32,520 --> 00:22:37,879 Speaker 1: long shot. No one collected the million dollars. But DARPA 348 00:22:38,000 --> 00:22:40,600 Speaker 1: wasn't about to give up there.
For one thing, the 349 00:22:40,640 --> 00:22:46,280 Speaker 1: agency had created an incredibly ambitious challenge, a really hard one, 350 00:22:46,720 --> 00:22:49,800 Speaker 1: and for another, the teams were learning from their mistakes, 351 00:22:50,200 --> 00:22:53,600 Speaker 1: and so DARPA chose to announce a second challenge that 352 00:22:53,600 --> 00:22:56,560 Speaker 1: would take place in two thousand five. It would double 353 00:22:56,760 --> 00:23:00,359 Speaker 1: the prize money to two million dollars, and the parameters 354 00:23:00,359 --> 00:23:03,320 Speaker 1: of the challenge would change slightly. Now, according to the 355 00:23:03,359 --> 00:23:06,920 Speaker 1: final report for the two thousand four challenge, the agency 356 00:23:07,040 --> 00:23:10,800 Speaker 1: quote believes it prudent to continue with the prize authority 357 00:23:10,840 --> 00:23:14,960 Speaker 1: approach and hold a second Grand Challenge for autonomous unmanned 358 00:23:15,200 --> 00:23:18,560 Speaker 1: ground vehicles in two thousand five. The prize authority approach 359 00:23:18,680 --> 00:23:21,600 Speaker 1: is meeting the goals of attracting new talent with new 360 00:23:21,640 --> 00:23:26,439 Speaker 1: ideas and accelerating advancement in robotic vehicle research. Without the 361 00:23:26,480 --> 00:23:29,119 Speaker 1: Grand Challenge, it is doubtful there would be much progress 362 00:23:29,119 --> 00:23:33,040 Speaker 1: without substantial new investment in accelerating research on autonomous ground 363 00:23:33,080 --> 00:23:37,440 Speaker 1: vehicles that could traverse difficult terrain at militarily relevant speeds 364 00:23:37,680 --> 00:23:42,320 Speaker 1: end quote. So how did that turn out? Well, I'll 365 00:23:42,320 --> 00:23:46,320 Speaker 1: tell you. First thing, get a drink of water, and 366 00:23:46,320 --> 00:23:56,880 Speaker 1: we're gonna thank our sponsors here.
As Anthony Tether 367 00:23:57,040 --> 00:23:59,760 Speaker 1: would say later, the thing to remember was the two 368 00:23:59,760 --> 00:24:02,400 Speaker 1: thousand four Grand Challenge was something new. It had 369 00:24:02,440 --> 00:24:05,080 Speaker 1: never been done before, and it was an engineering challenge 370 00:24:05,119 --> 00:24:07,800 Speaker 1: open to anyone, and teams that were able to meet 371 00:24:07,800 --> 00:24:11,920 Speaker 1: the selection criteria were able to participate. So while there 372 00:24:12,119 --> 00:24:14,919 Speaker 1: was disappointment when no car was able to make it 373 00:24:14,960 --> 00:24:18,040 Speaker 1: over the big hill that Sandstorm got stuck on, it 374 00:24:18,080 --> 00:24:22,320 Speaker 1: didn't diminish the excitement for a second Grand Challenge. Returning 375 00:24:22,359 --> 00:24:25,280 Speaker 1: teams were energized by the need to outperform their previous 376 00:24:25,320 --> 00:24:29,440 Speaker 1: attempts, and new teams came forward inspired by the original challenge. 377 00:24:29,520 --> 00:24:32,480 Speaker 1: So while it could have been the needle that would 378 00:24:32,560 --> 00:24:37,240 Speaker 1: deflate the autonomous balloon and lead to another artificial intelligence winter, 379 00:24:37,440 --> 00:24:39,840 Speaker 1: like I said in the last episode, it actually got 380 00:24:39,840 --> 00:24:44,320 Speaker 1: more people excited about those possibilities. So in August two 381 00:24:44,320 --> 00:24:48,560 Speaker 1: thousand four, DARPA held another participants conference for parties that 382 00:24:48,600 --> 00:24:51,840 Speaker 1: were interested in either participating or sponsoring a team. Now, 383 00:24:51,880 --> 00:24:55,120 Speaker 1: remember, DARPA had just held the Grand Challenge in March. Now 384 00:24:55,160 --> 00:24:57,680 Speaker 1: it's August and they're ready to talk about the next one.
385 00:24:57,920 --> 00:25:01,000 Speaker 1: The new challenge was scheduled for October eighth, two thousand five, 386 00:25:01,440 --> 00:25:04,359 Speaker 1: and the rules allowed for a flexible starting time. So if 387 00:25:04,400 --> 00:25:07,359 Speaker 1: the weather was bad or other conditions were such that 388 00:25:07,840 --> 00:25:10,160 Speaker 1: you couldn't really start on October eighth, you would start 389 00:25:10,200 --> 00:25:13,040 Speaker 1: the next day, or the day after that, up to 390 00:25:13,119 --> 00:25:15,600 Speaker 1: a cutoff date. If it kept slipping up to 391 00:25:15,600 --> 00:25:18,000 Speaker 1: that point, the whole thing would be canceled. Now, according 392 00:25:18,040 --> 00:25:20,280 Speaker 1: to the rules, they said the route would be quote 393 00:25:20,320 --> 00:25:23,879 Speaker 1: no longer than one hundred seventy five miles end quote. I guess that 394 00:25:23,960 --> 00:25:26,280 Speaker 1: was a relief since no one got past seven and 395 00:25:26,320 --> 00:25:28,600 Speaker 1: a half miles in the first one, and they also said 396 00:25:28,600 --> 00:25:33,440 Speaker 1: the route would include quote paved roads, unpaved roads, trails, 397 00:25:33,720 --> 00:25:39,240 Speaker 1: and off road desert areas. Examples of obstacles include ditches, berms, washboard, 398 00:25:39,400 --> 00:25:45,160 Speaker 1: sandy ground, standing water, rocks and boulders, narrow underpasses, construction equipment, 399 00:25:45,440 --> 00:25:50,280 Speaker 1: concrete safety rails, power line towers, barbed wire fences, and 400 00:25:50,480 --> 00:25:54,480 Speaker 1: cattle guards end quote. The rules also stated that DARPA 401 00:25:54,520 --> 00:25:58,680 Speaker 1: could introduce obstacles onto the course.
They could purposefully put 402 00:25:58,760 --> 00:26:01,400 Speaker 1: some obstacles in the way, but that the route would 403 00:26:01,400 --> 00:26:04,159 Speaker 1: be wide enough so that a vehicle could bypass the 404 00:26:04,200 --> 00:26:06,600 Speaker 1: obstacles without going off course. So they weren't going to 405 00:26:06,760 --> 00:26:10,440 Speaker 1: block a road entirely, but they would at least partially 406 00:26:10,440 --> 00:26:13,960 Speaker 1: block the road in certain locations. Now, if you want 407 00:26:14,040 --> 00:26:17,960 Speaker 1: proof that the two thousand four challenge did not discourage 408 00:26:17,960 --> 00:26:20,560 Speaker 1: participants, you just have to look at how many 409 00:26:20,600 --> 00:26:23,399 Speaker 1: people and teams applied to be part of the two 410 00:26:23,440 --> 00:26:26,800 Speaker 1: thousand five challenge. There were a hundred six people or 411 00:26:26,840 --> 00:26:29,119 Speaker 1: teams that applied in two thousand four, and in two thousand 412 00:26:29,119 --> 00:26:33,120 Speaker 1: five it was one hundred ninety five teams, nearly twice 413 00:26:33,160 --> 00:26:36,240 Speaker 1: as many as showed interest the year before. Out of 414 00:26:36,240 --> 00:26:41,800 Speaker 1: those, DARPA selected one hundred eighteen teams for site visits. 415 00:26:42,359 --> 00:26:46,360 Speaker 1: Out of those one hundred eighteen, they chose forty that were 416 00:26:46,400 --> 00:26:50,080 Speaker 1: selected to go on to the national qualification event, and 417 00:26:50,119 --> 00:26:54,000 Speaker 1: then they picked three more alternate teams that were added 418 00:26:54,000 --> 00:26:57,200 Speaker 1: in August two thousand five.
Now, like the two thousand 419 00:26:57,240 --> 00:27:00,400 Speaker 1: four qualifying event, this one was designed to simulate 420 00:27:00,400 --> 00:27:03,080 Speaker 1: some of the conditions for the final race, with the 421 00:27:03,119 --> 00:27:06,320 Speaker 1: goal of reducing the number of competitors to the top 422 00:27:06,440 --> 00:27:09,879 Speaker 1: twenty teams. All but one of the teams passed the 423 00:27:09,920 --> 00:27:13,640 Speaker 1: initial technical inspections to make certain the vehicles met all 424 00:27:13,720 --> 00:27:17,399 Speaker 1: the safety and performance parameters, and according to a news 425 00:27:17,440 --> 00:27:21,640 Speaker 1: release from October two thousand five, twenty two robotic vehicles 426 00:27:21,640 --> 00:27:24,119 Speaker 1: were able to get through the obstacle course designed to 427 00:27:24,200 --> 00:27:28,680 Speaker 1: simulate the final race. DARPA actually would select twenty three 428 00:27:28,680 --> 00:27:32,199 Speaker 1: teams to compete on the day of the race. They 429 00:27:32,240 --> 00:27:36,600 Speaker 1: included returning teams like CajunBot, Caltech, and Team DAD, 430 00:27:37,080 --> 00:27:40,080 Speaker 1: and there were new teams too, like Team Cornell, Desert 431 00:27:40,119 --> 00:27:45,320 Speaker 1: Buckeyes, the Gray Team, Insight Racing, and the Stanford Racing Team. 432 00:27:45,440 --> 00:27:48,960 Speaker 1: The final course for the two thousand five Challenge was 433 00:27:49,080 --> 00:27:52,200 Speaker 1: one hundred thirty one point two miles, which is about two 434 00:27:52,240 --> 00:27:56,360 Speaker 1: hundred eleven kilometers. Out of the twenty three competing teams, 435 00:27:56,880 --> 00:28:01,080 Speaker 1: five teams finished the full course, but only four of them 436 00:28:01,119 --> 00:28:04,800 Speaker 1: did so within the ten hour time limit.
The TerraMax, 437 00:28:04,840 --> 00:28:08,479 Speaker 1: which was in fifth place, would finish in twelve hours fifty 438 00:28:08,480 --> 00:28:10,520 Speaker 1: one minutes, so it took a little too long to 439 00:28:11,119 --> 00:28:13,879 Speaker 1: make that ten hour time limit, but it still finished. 440 00:28:14,359 --> 00:28:18,600 Speaker 1: All the other vehicles experienced either mechanical or software failures 441 00:28:18,640 --> 00:28:22,040 Speaker 1: somewhere along the route and were unable to complete the 442 00:28:22,119 --> 00:28:25,440 Speaker 1: full course. The winning team of the two thousand five 443 00:28:25,520 --> 00:28:30,600 Speaker 1: challenge was the Stanford Racing Team and their vehicle named Stanley. 444 00:28:30,880 --> 00:28:34,840 Speaker 1: The average speed for Stanley was nineteen point one miles 445 00:28:34,840 --> 00:28:37,639 Speaker 1: per hour, which is about thirty one kilometers per hour. 446 00:28:38,200 --> 00:28:40,960 Speaker 1: Stanley was built on top of a Volkswagen Touareg, 447 00:28:41,480 --> 00:28:45,280 Speaker 1: which could be largely operated by an onboard computer by 448 00:28:45,480 --> 00:28:49,720 Speaker 1: linking it to the vehicle's electrical system. However, they did 449 00:28:49,760 --> 00:28:53,479 Speaker 1: still have electromechanical parts to operate the steering and 450 00:28:53,600 --> 00:28:58,040 Speaker 1: the gearshift, so it wasn't all done purely electronically yet.
451 00:28:58,480 --> 00:29:02,200 Speaker 1: Stanley had five lidar units that could gather data 452 00:29:02,480 --> 00:29:05,160 Speaker 1: for the vehicle's computer to build out a three dimensional 453 00:29:05,240 --> 00:29:08,640 Speaker 1: map of its surroundings and pair that with GPS information, 454 00:29:09,160 --> 00:29:12,920 Speaker 1: and there was some other positional equipment aboard Stanley as 455 00:29:12,960 --> 00:29:16,880 Speaker 1: well to help supplement the information gathered by the GPS receiver. 456 00:29:17,360 --> 00:29:21,480 Speaker 1: The computer system operated on Linux, so Linux fans out there, 457 00:29:22,160 --> 00:29:25,520 Speaker 1: that was the computer system that was running on Stanley. 458 00:29:25,560 --> 00:29:28,440 Speaker 1: According to Stanford, the code for the vehicle's behavior consisted 459 00:29:28,480 --> 00:29:32,120 Speaker 1: of a hundred thousand lines, and its decision making process 460 00:29:32,160 --> 00:29:35,880 Speaker 1: was guided by machine learning. The team completed the course 461 00:29:36,240 --> 00:29:40,400 Speaker 1: in six hours fifty four minutes, and that meant that 462 00:29:40,480 --> 00:29:43,680 Speaker 1: they came in about eleven minutes ahead of second place. 463 00:29:44,120 --> 00:29:48,400 Speaker 1: Second place went to Red Team's Sandstorm vehicle. Again, that's 464 00:29:48,400 --> 00:29:51,600 Speaker 1: Carnegie Mellon University, and they were the ones 465 00:29:51,640 --> 00:29:55,120 Speaker 1: who got the furthest the year before. But Red Team 466 00:29:55,120 --> 00:29:58,040 Speaker 1: had also fielded another car. They didn't just put in one, 467 00:29:58,120 --> 00:30:02,040 Speaker 1: they put in two, and the other one was called Highlander 468 00:30:02,360 --> 00:30:05,480 Speaker 1: with a one in place of the I.
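For the technically curious, the core mapping idea described above, taking lidar returns measured from the vehicle and projecting them into a world-frame map using the pose estimated from GPS, can be sketched in a few lines of Python. This is purely an illustrative sketch with invented names and a toy grid, not Stanford's actual code; Stanley's real pipeline paired its laser terrain map with probabilistic machine learning methods and was far more sophisticated.

```python
import math

def update_grid(grid, pose, lidar_points, cell=1.0):
    """Mark the world-frame grid cells hit by body-frame lidar returns.

    pose is (x, y, heading) as estimated from GPS/INS; each lidar point
    is a (forward, left) offset measured from the vehicle. Rotating by
    the heading and translating by the position puts each return into
    the world frame, and marking the cell it lands in builds a coarse
    obstacle map a planner could steer around.
    """
    x, y, theta = pose
    for px, py in lidar_points:
        # Rotate into the world frame, then translate by the vehicle position.
        wx = x + px * math.cos(theta) - py * math.sin(theta)
        wy = y + px * math.sin(theta) + py * math.cos(theta)
        grid[(int(wx // cell), int(wy // cell))] = True
    return grid

# A return two meters dead ahead lands in a different world cell
# depending on which way the vehicle is facing.
east = update_grid({}, (0.0, 0.0, 0.0), [(2.0, 0.0)])
north = update_grid({}, (0.0, 0.0, math.pi / 2), [(2.0, 0.0)])
```

The design point is simply that perception output only becomes a map once it is fused with a pose estimate, which is why the GPS receiver and the supplemental positional equipment mattered as much as the lidar itself.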
I guess 469 00:30:05,520 --> 00:30:08,840 Speaker 1: there can be only one, but anyway, they put forward 470 00:30:08,960 --> 00:30:11,560 Speaker 1: Highlander as well. That one came in third place, so 471 00:30:11,640 --> 00:30:15,120 Speaker 1: Carnegie Mellon took second and third place. It was nine 472 00:30:15,160 --> 00:30:19,280 Speaker 1: minutes slower than Sandstorm, so it was about twenty minutes 473 00:30:19,360 --> 00:30:25,120 Speaker 1: behind the first place winner. Now, interesting side note about Highlander. 474 00:30:25,880 --> 00:30:30,880 Speaker 1: Initially it was making great time, like really really good time. 475 00:30:30,960 --> 00:30:33,360 Speaker 1: It was on track to win the whole darn thing. 476 00:30:33,960 --> 00:30:37,720 Speaker 1: But about two hours after starting the race, the vehicle's 477 00:30:37,800 --> 00:30:40,960 Speaker 1: engine began to sputter a bit, and it struggled anytime 478 00:30:41,000 --> 00:30:43,360 Speaker 1: it hit a steep climb, and it was never able 479 00:30:43,400 --> 00:30:45,440 Speaker 1: to get up to full speed even on a flat 480 00:30:45,760 --> 00:30:50,680 Speaker 1: or declined surface. So the Carnegie Mellon team was 481 00:30:50,720 --> 00:30:53,440 Speaker 1: really curious. They were wondering what the heck was going wrong, 482 00:30:54,040 --> 00:30:56,320 Speaker 1: and they tried to figure out what happened at the 483 00:30:56,400 --> 00:30:58,800 Speaker 1: end of the race. They weren't really able to nail 484 00:30:58,800 --> 00:31:01,720 Speaker 1: it down at that point. They checked the fuel. The 485 00:31:01,760 --> 00:31:05,400 Speaker 1: fuel was fine. It wasn't contaminated or old or anything 486 00:31:05,440 --> 00:31:07,640 Speaker 1: like that. The oil was fine. The engine appeared to 487 00:31:07,680 --> 00:31:10,720 Speaker 1: work just fine from a cold start, and yet 488 00:31:10,720 --> 00:31:14,520 Speaker 1: it had failed out on the course.
It would take more than a decade before 489 00:31:14,520 --> 00:31:17,000 Speaker 1: they figured out what had gone wrong, and it was 490 00:31:17,120 --> 00:31:20,200 Speaker 1: just by luck that they figured it out. Now, this 491 00:31:20,280 --> 00:31:23,160 Speaker 1: is going to include a spoiler, but it's a spoiler 492 00:31:23,200 --> 00:31:25,160 Speaker 1: for something that happened in two thousand seven, so I 493 00:31:25,160 --> 00:31:29,040 Speaker 1: don't think it's really that big a deal. In two thousand seventeen, Carnegie 494 00:31:29,080 --> 00:31:33,040 Speaker 1: Mellon University was celebrating the tenth anniversary of the team's 495 00:31:33,240 --> 00:31:36,880 Speaker 1: DARPA Urban Challenge win from two thousand seven, and I'll 496 00:31:36,920 --> 00:31:39,480 Speaker 1: be talking about the DARPA Urban Challenge in our next 497 00:31:39,520 --> 00:31:43,160 Speaker 1: episode. Well, as part of this celebration, they brought out 498 00:31:43,320 --> 00:31:47,760 Speaker 1: Sandstorm and Highlander from storage. The two vehicles had been 499 00:31:48,400 --> 00:31:52,000 Speaker 1: stored away, but they still existed. And while they were doing this,
500 00:31:52,600 --> 00:31:55,320 Speaker 1: Spencer Spiker, who was part of the team working on this, 501 00:31:56,120 --> 00:32:00,240 Speaker 1: was running some diagnostics on the Highlander engine, which was 502 00:32:00,360 --> 00:32:03,560 Speaker 1: running at the time, and as it was going through 503 00:32:03,560 --> 00:32:06,880 Speaker 1: this process, he was leaning against the vehicle and the 504 00:32:06,880 --> 00:32:09,800 Speaker 1: engine began to sputter, and when he moved away, the 505 00:32:09,840 --> 00:32:12,960 Speaker 1: engine started to pick up again, and he realized that 506 00:32:13,040 --> 00:32:16,320 Speaker 1: there was a little box that, if pressure 507 00:32:16,360 --> 00:32:18,880 Speaker 1: was put on it, started to cause the 508 00:32:18,880 --> 00:32:21,520 Speaker 1: engine to die. That little box turned out to be 509 00:32:21,560 --> 00:32:24,479 Speaker 1: a filter that sat between the car's engine control module 510 00:32:24,880 --> 00:32:27,520 Speaker 1: and the fuel injectors, and he found out that if 511 00:32:27,560 --> 00:32:30,000 Speaker 1: any pressure was put on that filter, it 512 00:32:30,040 --> 00:32:33,440 Speaker 1: caused the engine to lose power. And leading up to 513 00:32:33,480 --> 00:32:36,600 Speaker 1: the Grand Challenge, there was an incident that may have 514 00:32:37,640 --> 00:32:42,280 Speaker 1: caused this problem. The Highlander had been going through a 515 00:32:43,280 --> 00:32:47,520 Speaker 1: kind of routine training exercise, and as it 516 00:32:47,600 --> 00:32:50,600 Speaker 1: was doing so, it got too far over to one 517 00:32:50,720 --> 00:32:53,560 Speaker 1: edge of a sloped path and it slid off that 518 00:32:53,600 --> 00:32:58,240 Speaker 1: sloped path and it actually flipped over.
Now, apparently that accident, 519 00:32:58,240 --> 00:33:01,080 Speaker 1: which appeared to be minor, it didn't seem like it 520 00:33:01,200 --> 00:33:05,120 Speaker 1: had caused that much damage, had actually bent this little box, 521 00:33:05,200 --> 00:33:08,840 Speaker 1: this filter, in such a way that its 522 00:33:08,840 --> 00:33:12,840 Speaker 1: contact with the engine caused failures under the right conditions, 523 00:33:12,840 --> 00:33:15,040 Speaker 1: such as when pressure was put on it or when 524 00:33:15,040 --> 00:33:18,880 Speaker 1: it would expand from engine heat. And so the Highlander, 525 00:33:19,360 --> 00:33:21,760 Speaker 1: after it would start to heat up, would start to 526 00:33:21,880 --> 00:33:26,080 Speaker 1: lose performance, and it lost time during the Grand Challenge, 527 00:33:26,080 --> 00:33:29,240 Speaker 1: about forty minutes worth of time according to most estimates, 528 00:33:29,560 --> 00:33:31,440 Speaker 1: which would have made all the difference. It would have 529 00:33:31,480 --> 00:33:34,120 Speaker 1: come in first place by twenty minutes if that had 530 00:33:34,160 --> 00:33:37,880 Speaker 1: not happened. So it may have won if it hadn't 531 00:33:37,920 --> 00:33:41,800 Speaker 1: been for that accident that had happened earlier. Making things 532 00:33:41,800 --> 00:33:45,200 Speaker 1: even more juicy is that the leader of the Stanford 533 00:33:45,200 --> 00:33:49,800 Speaker 1: team that did win was Sebastian Thrun, who had previously 534 00:33:49,880 --> 00:33:54,080 Speaker 1: served as a member of the faculty at Carnegie Mellon University.
535 00:33:54,120 --> 00:33:57,240 Speaker 1: He was, in fact, a colleague of the head of 536 00:33:57,320 --> 00:34:01,640 Speaker 1: CMU's team, Red Whittaker. So the Red Team's Red Whittaker, 537 00:34:02,200 --> 00:34:05,640 Speaker 1: the Carnegie Mellon University team leader, used to work with 538 00:34:05,680 --> 00:34:08,759 Speaker 1: the guy who led the Stanford team. So there was 539 00:34:08,800 --> 00:34:12,479 Speaker 1: some rivalry going on there, although from what I've read, 540 00:34:12,600 --> 00:34:14,880 Speaker 1: it sounds like it was all very good natured rivalry. 541 00:34:14,920 --> 00:34:17,520 Speaker 1: It wasn't like it was super bitter or anything. Now, 542 00:34:17,520 --> 00:34:21,520 Speaker 1: the nice thing is no one seems particularly mad that 543 00:34:21,640 --> 00:34:24,120 Speaker 1: this happened. It's the sort of thing that can happen 544 00:34:24,160 --> 00:34:28,360 Speaker 1: with mechanical systems in general. And ultimately, Carnegie Mellon University 545 00:34:28,400 --> 00:34:31,279 Speaker 1: would do very well in the next Grand Challenge. But 546 00:34:31,320 --> 00:34:33,480 Speaker 1: that's a story we're going to cover in our next episode. 547 00:34:34,360 --> 00:34:39,000 Speaker 1: DARPA's statement was pretty darn positive. Quote. The Grand Challenge 548 00:34:39,120 --> 00:34:44,839 Speaker 1: stimulated the creation of a new community of innovators, inventors, mechanics, 549 00:34:45,080 --> 00:34:49,680 Speaker 1: computer scientists, engineers, and students who typically have not been 550 00:34:49,719 --> 00:34:54,520 Speaker 1: involved in defense related activities. The camaraderie and competitiveness that 551 00:34:54,600 --> 00:34:57,160 Speaker 1: have been the hallmark of the Grand Challenge since its 552 00:34:57,160 --> 00:35:02,080 Speaker 1: inception demonstrates that America's heritage of ingenuity and resourcefulness is 553 00:35:02,280 --> 00:35:06,080 Speaker 1: strong end quote.
Now, in our next episode, we're gonna 554 00:35:06,120 --> 00:35:09,040 Speaker 1: look at the Urban Grand Challenge of two thousand seven 555 00:35:09,400 --> 00:35:13,320 Speaker 1: and how these competitions led into the autonomous car environment 556 00:35:13,360 --> 00:35:16,040 Speaker 1: that we are in today. And we'll also talk about 557 00:35:16,080 --> 00:35:19,120 Speaker 1: some of the most difficult challenges we face, both technical 558 00:35:19,239 --> 00:35:23,000 Speaker 1: and ethical. And we'll also talk about the pros and 559 00:35:23,080 --> 00:35:26,200 Speaker 1: cons of driverless cars in our upcoming episodes, so 560 00:35:26,239 --> 00:35:27,960 Speaker 1: I hope you look forward to those. It's been a 561 00:35:27,960 --> 00:35:30,320 Speaker 1: lot of fun kind of going back through the history 562 00:35:30,440 --> 00:35:34,120 Speaker 1: and seeing what has led up to what we see today. 563 00:35:34,400 --> 00:35:38,040 Speaker 1: It's really fascinating to me to see the transition 564 00:35:38,080 --> 00:35:42,120 Speaker 1: away from a world where all of the autonomous nature 565 00:35:42,200 --> 00:35:46,040 Speaker 1: of cars is built into the infrastructure surrounding cars to 566 00:35:46,239 --> 00:35:50,560 Speaker 1: one where it's largely built into the vehicles themselves, and 567 00:35:50,680 --> 00:35:55,400 Speaker 1: mostly because the Department of Defense needed innovation in that 568 00:35:55,520 --> 00:35:58,680 Speaker 1: space in order to meet a deadline to have one 569 00:35:58,719 --> 00:36:02,839 Speaker 1: third of all ground military combat vehicles automated by two 570 00:36:03,239 --> 00:36:06,480 Speaker 1: thousand fifteen. We'll talk more about that as well in an 571 00:36:06,560 --> 00:36:09,960 Speaker 1: upcoming episode.
For the time being, if you guys have 572 00:36:10,000 --> 00:36:13,400 Speaker 1: any suggestions for future episodes, or you've got any suggestions 573 00:36:13,400 --> 00:36:16,120 Speaker 1: for people I should have on the show, or maybe 574 00:36:16,480 --> 00:36:18,759 Speaker 1: there's just a topic that you've always wanted to know 575 00:36:18,840 --> 00:36:21,600 Speaker 1: about, send me a message. The email address for the 576 00:36:21,640 --> 00:36:26,200 Speaker 1: show is tech Stuff at how stuff works dot com. 577 00:36:26,320 --> 00:36:29,280 Speaker 1: Or pop on over to our website, that's tech Stuff 578 00:36:29,400 --> 00:36:33,560 Speaker 1: podcast dot com. You'll find other ways to contact me there. 579 00:36:33,960 --> 00:36:36,080 Speaker 1: You can reach out on social. I'd love to hear 580 00:36:36,120 --> 00:36:38,520 Speaker 1: from you, and don't forget to go to our store. 581 00:36:38,600 --> 00:36:42,120 Speaker 1: It's at tee public dot com slash tech stuff. Every 582 00:36:42,160 --> 00:36:44,480 Speaker 1: purchase you make goes to help the show. We've got 583 00:36:44,560 --> 00:36:47,920 Speaker 1: lots of cool designs on there. I'd love for you to 584 00:36:48,080 --> 00:36:51,040 Speaker 1: check those out. I'm really proud of them. 585 00:36:51,040 --> 00:36:53,439 Speaker 1: I'm really happy with the way they've turned out, and 586 00:36:53,760 --> 00:36:55,640 Speaker 1: if you like something and 587 00:36:55,680 --> 00:36:57,640 Speaker 1: you buy it, like I said, some of that money 588 00:36:57,680 --> 00:37:00,000 Speaker 1: goes to help our show and we greatly appreciate it. 589 00:37:00,600 --> 00:37:08,680 Speaker 1: And I'll talk to you again really soon. 590 00:37:08,719 --> 00:37:11,040 Speaker 1: For more on this and thousands of other topics, visit how 591 00:37:11,080 --> 00:37:21,960 Speaker 1: stuff works dot com.