Speaker 1: Get in touch with technology with TechStuff from howstuffworks.com. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with How Stuff Works and iHeartRadio, and I love all things tech. And this is the third episode in a series about driverless cars. In the first two episodes, I covered the early concepts, which mostly involved building out an infrastructure that's outside of cars, like a smart superhighway that would guide traffic to where it would need to go. And I also talked about the first two Grand Challenges that were sponsored by DARPA; that's the R&D division of the Department of Defense. DARPA's goal was to create an incentive for engineers, researchers, and computer scientists to innovate in that autonomous vehicle space, because the United States Congress had decided that a third of all military ground combat vehicles should be autonomous by the year, and progress was just not going at the speed that was needed to make that happen.
Speaker 1: So DARPA's assessment was that the usual defense contractors just weren't making progress fast enough, and that they needed to have some other way of getting new ideas, new blood, into all of this. So the Grand Challenge was a way to invite numerous other engineers to solve these really difficult engineering problems. The two thousand four challenge saw no winner; no one was able to take home the prize money. The best-performing team made it less than seven and a half miles through a 142-mile course. But the two thousand five challenge turned out much better. Five teams completed the full course, and Stanford's racing team took the top prize that year, which was more than a million dollars. Now, there was no question that the challenges had encouraged invention and innovation, from creative computer algorithms that would guide an autonomous car through decisions on when and how to act, to designing all-new hardware that would be used to sense the environment. These challenges had pushed technological development considerably.
Speaker 1: But the driverless cars of two thousand five were meant to cross a desert course, sort of an off-road course. There was tricky terrain to cross, and the cars had to make some really complex maneuvers in a few cases, but there was no need to compensate for the types of stuff that your typical human driver encounters every day. Namely, there was no need to worry about traffic laws, or traffic, for that matter. And it was important for the cars to be able to detect obstacles in order to work their way around them, but there was less worry about more specific challenges, like navigating through a city that has lots of potential moving obstacles: pedestrians, bicyclists, other cars. So for the next challenge, which would take place in two thousand seven, DARPA wanted to put teams to the test and have them design a car that could navigate through a simulated urban environment. The agency had increased the prize money from two thousand four to two thousand five — the prize kitty was a million dollars for two thousand four, and it was two million for two thousand five.
Speaker 1: The urban version of the challenge upped it again, to three and a half million dollars in prize money total. DARPA offered two different routes to enter the Urban Challenge; you could apply using one of two different approaches, which they called Track A and Track B. A team submitting to participate in Track A would be eligible to receive funding from DARPA of up to a million dollars. But to qualify, the team would have to submit a detailed proposal for review, and then, if selected, they would have to demonstrate the progress of the prototype at four different milestone events in order to remain eligible for that program. If the team could not meet the standards that DARPA had set, then they would be removed from Track A and all financial support would stop. Sixty-five teams submitted for Track A consideration; DARPA would select eleven of them to participate. Track B teams were self-supported, meaning they were responsible for securing the funding necessary for technology development, execution, that sort of stuff. Those teams could seek out funding from sponsors and from other sources.
Speaker 1: So Track B teams also had to participate in certain qualifying events in order to continue in the competition. There were site visits, and then there was a big qualifier that a lot of teams had to participate in, and I'll talk about that a little bit later. Now, those qualifiers were meant to demonstrate that the respective teams' vehicles could have a decent chance of completing the competition's tasks on the actual day of the final event. So it's really all about whittling down the competitors to the ones most likely to achieve success. Track B teams also had several additional rules they had to follow. This is from DARPA's actual official rule book for the two thousand seven Urban Challenge. So, for example, the team leader of a Track B team would be ineligible to participate in any other Urban Challenge team; they could only work on the team they were leading. If you were not a team leader, you could switch teams. You could work on one team and then switch to another one.
Speaker 1: Let's say that you're working on one team and it is unable to meet the qualifications that are necessary at a certain milestone; then you would be allowed to switch to another team, if that other team wanted you. Also, if you were a team leader, you had to be at least twenty-one years old, and you had to be a US citizen. Non-citizens could participate on teams, but they could not be team leaders. This is another reminder that ultimately this competition was keyed into national defense. The spirit of the competition was very jovial from what I can understand — it was very cooperative — but when you take a step back and you look at the overall picture, you remember: oh, this is so that DARPA can develop the technologies, or can encourage the development of the technologies, that will be needed to power the future ground combat vehicles. Unlike in the previous Grand Challenges, DARPA even allowed teams to secure funding from government sources.
Speaker 1: In the previous challenges — if you listened to my last episode, you know — they said you couldn't get money from federal sources. This time you could, but all such funding had to be explicitly approved and authorized by whatever the respective government department was for use in this Grand Challenge. Government-sponsored teams were, however, prevented from using any technology or information that was under any kind of classified status. So you couldn't use any top secret stuff, because that was against the rules. I mentioned briefly what the overall goal of the Urban Challenge was: this idea of maneuvering through a simulated urban environment. But I think it's more helpful if we actually go through all the elements that DARPA spelled out in the official rules. This is what was sent to any interested party that was applying to be part of the Urban Challenge. It illustrates exactly how challenging the whole thing ended up being, and what the specific objectives of the Grand Challenge were — and these are all quotes from the rule book. Complete a mission defined by an ordered series of checkpoints in a complex route network.
Speaker 1: The vehicle will have five minutes to process a mission description before attempting the course. Interpret static lane markings — as in white or yellow lines — provided with the Route Network Definition File, that's RNDF, and behave in accordance with applicable traffic laws and conventions. DARPA's intent is for the RNDF lane boundary descriptors to match the physical lane markings on the ground. DARPA cannot ensure that this will be the case in all areas, and as such, the RNDF will take precedence over the physical ground markings in conflicting areas. So, in other words, DARPA would provide to each team a digital file that would represent the area — the features of the course.
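That precedence rule — trust the RNDF over the paint on the ground wherever the two disagree — can be sketched in a few lines. This is purely an illustrative sketch, not DARPA's actual format or any team's code: the flat x/y coordinates in meters, the tolerance value, and the function name are all my own assumptions (a real RNDF uses latitude/longitude waypoints).

```python
import math

def reference_point(rndf_xy, perceived_xy, tolerance_m=1.0):
    """Pick the lane point the planner should track.

    Hypothetical helper: per the Urban Challenge rules, the RNDF takes
    precedence over the physically perceived lane marking wherever the
    two conflict; within tolerance, the sensed marking refines the estimate.
    """
    dx = rndf_xy[0] - perceived_xy[0]
    dy = rndf_xy[1] - perceived_xy[1]
    if math.hypot(dx, dy) > tolerance_m:
        return rndf_xy       # conflict: go with the file, not with reality
    return perceived_xy      # agreement: the sensed marking is fine to use
```

So a lane marking perceived three meters away from where the file says it should be gets ignored in favor of the file, while one a few centimeters off is accepted.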
Speaker 1: It wouldn't give information about where anyone was expected to travel, but it would give a layout of all the different roads in that course, and according to that file, you would program your autonomous car's behaviors to follow within those guidelines. And if the car were to come upon a part of the road that was not in accordance with that same digital file, it would defer to the digital file as opposed to the actual conditions of the road. So if in one case a lane is shown as being in one orientation and it turns out it's slightly off in reality, you go with the file, not with what reality is. This reminds us that this is in fact a simulation, because that does not fly in a real-world situation; you have to conform to what is really there, for all drivers. Back to the rules: the vehicle is to exhibit context-dependent speed control to ensure safe operation, including adherence to speed limits. It's to exhibit safe following behavior when approaching other vehicles from behind in a traffic lane, which includes maintaining a safe following distance.
Speaker 1: It's to exhibit safe check-and-go behavior when pulling around a stopped vehicle, pulling out of a parking spot, moving through intersections, and in situations where collision is possible. And it's to stay on the road and in a legal and appropriate travel lane while en route, including around sharp turns, through intersections, and while passing. The Route Network Definition File will specify the GPS coordinates of the stop signs. The RNDF specifies the location of stop lines on the ground. On paved areas, such stop lines will be represented by a painted stop line on the pavement. Physical stop signs, however, may or may not be present at the stop line locations. Navigate safely in areas where GPS signals are partially or entirely blocked.
Speaker 1: They're to follow paved and unpaved roads and stay in the lane with very sparse or low-accuracy GPS waypoints. Very important, because for a long time GPS receivers were of questionable reliability. Especially if you were going through a city with a lot of skyscrapers, for example, you might have signals blocked, and then you would lose connectivity with the GPS satellites. This still happens to this day, but not as frequently, at least not in my experience. It turns out that, whether there are receivers that are using other systems to supplement that data, or the actual receivers are just more sensitive, it seems to work more reliably these days. The vehicle was to change lanes safely when legal and appropriate, such as when passing a vehicle or entering an opposing traffic lane to pass a stopped vehicle. Vehicles must not pass other vehicles queued at an intersection. Very important: you can go around a car that's stopped in the middle of the road because of some possible mechanical failure.
Speaker 1: But you can't go around a car just because it's come to a stop at an intersection, because it might be waiting for its turn to go. Merge safely with traffic moving in one or more lanes. After stopping at an intersection, the vehicle would be able to pull across one lane of moving traffic to merge with moving traffic in the opposing lane. It was supposed to stop safely within one meter of the stop line at a stop-sign intersection and proceed without excessive delay — so, less than ten seconds of delay — according to intersection precedence rules. It was to exhibit proper queueing behavior at an intersection, including stopping at a safe distance from other vehicles and stop-and-go procession to the stop line without excessive delay. It was to navigate toward a destination in a large open area where minimal or no GPS points are provided, as in loading dock areas or parking lots. These areas may contain fixed obstacles, such as parked vehicles, and moving obstacles, including other vehicles. It was to safely pull into and back out of a specified parking space in a parking lot.
Speaker 1: It was to safely execute one or more three-point turning maneuvers to effect a U-turn. And it was to dynamically replan and execute the route to a destination if the primary route is blocked or impassable. These are all things that human drivers can do, and these are all things that human drivers do on a regular basis. But as you sit there and think about what's needed to program a machine to be able to do these things, you start to recognize the challenges here, because you have to be able to detect these scenarios and then you have to be able to react appropriately based upon the information that's available to you. So it's not just, you know, if this, then that; there are a lot of other considerations, and it gives you a hint at the challenge that was ahead of these teams. However, DARPA did point out a few things that were outside the scope of this challenge, that teams would not need to worry about for the purposes of this competition. That included the recognition of external traffic signals, like traffic lights and stop signs, through the use of sensors.
Speaker 1: You didn't have to worry about that. Because they were providing information about where stop signs were, based on GPS coordinates, you could program that directly into your system, so that your car quote-unquote knows to stop at a certain point because it quote-unquote knows there's a stop sign there. It doesn't have to detect it; it's been programmed with that info. So there was no need to develop any kind of optical system that would detect a stop sign or a traffic light. This works great if you're operating in a very limited area, right? In this case, in this simulated urban environment. If you're operating in something where you've got a defined border, and you know quote-unquote everything that's inside that border, you can program this kind of stuff into your vehicle directly. But this approach gets more and more impractical the larger your area of service is. So while you might be able to program this in for an area that might consist of a few simulated blocks of a city, it doesn't really work so well for a full city.
Speaker 1: You would have to go in and program the GPS coordinates for every single stop sign and every single traffic signal. Not only that, but for traffic signals, you would also have to figure out a way for your vehicle to understand when the signal had changed. But for the purposes of the competition, they said this isn't necessary. They also said it's not necessary to be able to recognize pedestrians or build pedestrian avoidance into your vehicle, because there were not going to be any pedestrians on the simulated course. Clearly, that would also be something that would be necessary if we were to use driverless cars in everyday life, but for the purposes of this challenge, it was not pertinent. Also outside the scope were behaviors necessary for highway driving, such as high-speed passing or a high-speed merge at an on-ramp. Speed limits for the Urban Challenge would be thirty miles per hour or less, because, again, it was supposed to be going through surface streets in a city environment, not on a highway. Driving in difficult off-road terrain is outside the scope of the program.
Speaker 1: Off-road navigation in an unpaved area, travel along roads with potholes, and travel along a dirt road are within scope. So no off-road travel would be necessary, but they didn't guarantee that all the roads would be in perfect condition, and a vehicle might face some of the difficult terrain that a regular driver might encounter if the area they are in has different types of roads, like gravel roads and that kind of thing. Now, obviously we would need more than what DARPA was rolling out to have, like, a driverless car that a person could safely ride in. But you have to crawl before you can walk. We'll talk a little bit more about the Urban Challenge, but first let's take a quick break to thank our sponsor. All right, so I've talked about the objective of the Urban Challenge, but what about the actual procedure? Well, every team would receive a detailed layout of the test area — that would be that digital file I mentioned — that would have all the accessible roads, all the parts of the course that the vehicles could drive through without penalty.
Speaker 1: That would help teams define exactly where their vehicles could and could not go. So this is sort of like getting a really, really good map of a specific part of a city that you're working with. There's no information about the route on that map; it's just a map of the area. Immediately before the test, vehicles would be given a series of checkpoints that they were told they had to visit. The pathway between those checkpoints was undefined, however, so the vehicles could actually plot their own course from start to finish — which is kind of like getting in a friend's car, and you say, hey, let's go to that pizza place that we like to go to, and you find out that they take a different route than you would to get to that specific destination. So each vehicle would be able to plot its own route, the one that made the most quote-unquote sense to the respective vehicle, and they'd have to do this a few times. There'd be a few different checkpoints, but the actual direction that the vehicle would travel would be up to the vehicle.
Speaker 1: In addition, there could be road blockages on the course, and they would not be indicated by the RNDF information. They wouldn't be on that map. So you wouldn't get a map that said, oh, by the way, this little road here is actually going to be blocked, so you can't go down it. And that was important: they wanted to have the blockages there, but they didn't want the teams to have advance knowledge of which routes were blocked. Just like with a static map, you wouldn't know if a road was temporarily closed because of a wreck or flooding or whatever it might be. So when you encounter that sort of obstacle — if you're a human driver, and let's say you've just got a paper map, no GPS receiver — you're forced to reevaluate your plan. You're forced to look at the map and consult it and figure out a different way to get to where you're going, and you have to change things on the fly. Well, that would happen in the Urban Challenge as well, to these autonomous cars. In addition to the Route Network Definition File, DARPA would provide each team with the Mission Data File, or MDF.
This is the one 324 00:19:44,560 --> 00:19:47,520 Speaker 1: that would actually have the information about the checkpoints that 325 00:19:47,600 --> 00:19:50,280 Speaker 1: the vehicle was meant to visit, as well as information 326 00:19:50,280 --> 00:19:54,679 Speaker 1: about the minimum and maximum speed limits for each road segment, saying, 327 00:19:55,400 --> 00:19:58,119 Speaker 1: along this section of this road, you can go up 328 00:19:58,119 --> 00:20:00,560 Speaker 1: to thirty miles per hour. For example, on another section, 329 00:20:00,840 --> 00:20:04,200 Speaker 1: twenty miles per hour is your maximum. That kind of thing. And 330 00:20:04,680 --> 00:20:07,240 Speaker 1: as the rules had stated, DARPA expected all vehicles to 331 00:20:07,280 --> 00:20:10,240 Speaker 1: operate within those speed parameters for each road segment. Actually, 332 00:20:10,560 --> 00:20:15,159 Speaker 1: to be more specific, each car was to behave according 333 00:20:15,320 --> 00:20:23,080 Speaker 1: to the driving laws of California. Vehicles could be teleoperated, 334 00:20:23,520 --> 00:20:27,080 Speaker 1: that is, they could be controlled remotely, but only for 335 00:20:27,160 --> 00:20:30,000 Speaker 1: the purposes of staging the vehicle at the start of 336 00:20:30,040 --> 00:20:34,200 Speaker 1: the competition, so a team could take over manual control 337 00:20:34,320 --> 00:20:36,960 Speaker 1: of the vehicle in order to move it to its 338 00:20:37,040 --> 00:20:41,880 Speaker 1: starting point, but then they would have to disengage from control, 339 00:20:42,160 --> 00:20:44,600 Speaker 1: and at that point, each vehicle had to be under 340 00:20:44,680 --> 00:20:49,119 Speaker 1: complete autonomous control with no input from the teams until 341 00:20:49,160 --> 00:20:52,480 Speaker 1: the completion of the course, or until the vehicle experienced 342 00:20:52,560 --> 00:20:55,720 Speaker 1: some sort of failure that would remove it from consideration.
343 00:20:56,280 --> 00:20:58,119 Speaker 1: All vehicles had to be built on top of a 344 00:20:58,200 --> 00:21:01,879 Speaker 1: chassis that had a documented safety record, so you couldn't 345 00:21:01,960 --> 00:21:05,320 Speaker 1: just build an entirely new vehicle on a custom chassis 346 00:21:05,600 --> 00:21:09,119 Speaker 1: and be eligible for the competition. Most teams would retrofit 347 00:21:09,240 --> 00:21:13,240 Speaker 1: existing vehicles for their entries. In addition, all vehicles had 348 00:21:13,320 --> 00:21:17,000 Speaker 1: specific parameters they had to fall inside, including weight requirements. 349 00:21:17,160 --> 00:21:19,600 Speaker 1: They had to be at least two thousand pounds, 350 00:21:20,240 --> 00:21:23,680 Speaker 1: that's about nine hundred seven kilograms. They could be no heavier 351 00:21:23,720 --> 00:21:28,760 Speaker 1: than a whopping thirty thousand pounds or thirteen thousand, six 352 00:21:28,880 --> 00:21:33,879 Speaker 1: hundred kilograms. Now, for reference, the heaviest consumer car on 353 00:21:33,920 --> 00:21:37,040 Speaker 1: the market that I could find is the Mercedes-Maybach 354 00:21:37,080 --> 00:21:41,560 Speaker 1: S six hundred Pullman Guard at eleven thousand, two hundred 355 00:21:41,680 --> 00:21:46,520 Speaker 1: forty four pounds or five thousand one hundred kilograms. Still very 356 00:21:46,600 --> 00:21:49,200 Speaker 1: shy of that thirty thousand mark. If you want to 357 00:21:49,240 --> 00:21:52,879 Speaker 1: go with pickup trucks, that's the heaviest class of commercial, 358 00:21:53,000 --> 00:21:55,639 Speaker 1: well, consumer vehicle I should say, not commercial. I mean 359 00:21:55,680 --> 00:21:58,880 Speaker 1: you've got massive dump trucks and stuff. They're super heavy.
360 00:21:59,040 --> 00:22:00,359 Speaker 1: But if you wanted to go out and buy the 361 00:22:00,359 --> 00:22:03,320 Speaker 1: heaviest pickup truck out there, you would look for the 362 00:22:03,400 --> 00:22:07,359 Speaker 1: International XT, which weighs in at fourteen thousand, fifty 363 00:22:07,400 --> 00:22:12,280 Speaker 1: one pounds or about six thousand four hundred kilograms. You're still less 364 00:22:12,320 --> 00:22:16,280 Speaker 1: than half of what the upper weight limit is at 365 00:22:16,320 --> 00:22:20,320 Speaker 1: that point. All other heavy consumer vehicles like SUVs and 366 00:22:20,400 --> 00:22:24,080 Speaker 1: vans are actually in between those two extremes. So it 367 00:22:24,119 --> 00:22:26,720 Speaker 1: would have to be a really hefty vehicle to max 368 00:22:26,840 --> 00:22:30,600 Speaker 1: out the weight limit of DARPA. But this reminds us 369 00:22:30,640 --> 00:22:33,840 Speaker 1: again that the ultimate goal for DARPA is to develop technology to 370 00:22:33,880 --> 00:22:38,639 Speaker 1: devote toward automating military combat ground vehicles, which tend to 371 00:22:38,640 --> 00:22:42,000 Speaker 1: be a bit on the hefty side. In addition, the 372 00:22:42,119 --> 00:22:45,800 Speaker 1: vehicles had to have a minimum wheelbase of seventy two inches, 373 00:22:45,800 --> 00:22:48,639 Speaker 1: which is about one point eight meters from the front 374 00:22:48,640 --> 00:22:51,879 Speaker 1: axle to the back axle, and they could have a 375 00:22:51,920 --> 00:22:55,560 Speaker 1: maximum width of nine feet or two point seven four 376 00:22:55,680 --> 00:23:00,520 Speaker 1: meters and a maximum height of twelve feet or about 377 00:23:00,560 --> 00:23:04,080 Speaker 1: three point seven meters. Vehicles had to be able to 378 00:23:04,119 --> 00:23:08,240 Speaker 1: move autonomously, both forward and in reverse. They had to 379 00:23:08,280 --> 00:23:09,920 Speaker 1: be able to make that U-turn.
They had to 380 00:23:09,920 --> 00:23:12,480 Speaker 1: be able to turn on a typical urban street, which 381 00:23:12,480 --> 00:23:15,840 Speaker 1: would be about thirty feet wide or nine meters, and 382 00:23:15,920 --> 00:23:19,360 Speaker 1: not climb up on the curb on either side. They 383 00:23:19,359 --> 00:23:23,520 Speaker 1: had to be vehicles that traveled on tires. There 384 00:23:23,560 --> 00:23:26,200 Speaker 1: could be no treads, no tracks, nothing like that. Had 385 00:23:26,240 --> 00:23:29,960 Speaker 1: to be on tires. They were not to damage 386 00:23:30,000 --> 00:23:32,480 Speaker 1: the surface of a street with their passage, so it 387 00:23:32,520 --> 00:23:36,040 Speaker 1: couldn't be a means of getting around that could actually 388 00:23:36,040 --> 00:23:39,119 Speaker 1: tear up the road. And all of a vehicle's sensors and 389 00:23:39,160 --> 00:23:42,560 Speaker 1: technologies had to be self-contained. The teams would not 390 00:23:42,600 --> 00:23:45,280 Speaker 1: be allowed to set up any sort of additional sensors 391 00:23:45,359 --> 00:23:49,200 Speaker 1: in the area to aid in the vehicle's navigation or operation. 392 00:23:49,800 --> 00:23:53,640 Speaker 1: All data processing would similarly have to take place inside 393 00:23:53,680 --> 00:23:56,000 Speaker 1: the vehicle. So it was against the rules for the 394 00:23:56,080 --> 00:24:00,119 Speaker 1: sensors to send data to some external computer for processing 395 00:24:00,440 --> 00:24:03,760 Speaker 1: and then have that external computer beam back instructions. And 396 00:24:03,800 --> 00:24:06,640 Speaker 1: when you consider the fact that vehicles need to make 397 00:24:07,160 --> 00:24:12,720 Speaker 1: split-second decisions to avoid a potential accident, you probably 398 00:24:12,760 --> 00:24:15,320 Speaker 1: come to the same conclusion that this is the most 399 00:24:15,400 --> 00:24:18,639 Speaker 1: logical approach.
You don't want to insert delay if you 400 00:24:18,640 --> 00:24:21,960 Speaker 1: can avoid it. It was also against the rules to 401 00:24:21,960 --> 00:24:24,720 Speaker 1: operate a vehicle that was a hazard to its environment. 402 00:24:24,920 --> 00:24:28,760 Speaker 1: DARPA stated that except for the quote normal byproducts of 403 00:24:28,840 --> 00:24:32,159 Speaker 1: power generation end quote, vehicles would not be allowed to 404 00:24:32,240 --> 00:24:36,520 Speaker 1: jettison any other material from them. So you couldn't do 405 00:24:36,600 --> 00:24:39,399 Speaker 1: anything other than put out the normal kind of exhaust 406 00:24:39,840 --> 00:24:43,840 Speaker 1: that vehicles tend to put out. Actually, DARPA specifically prohibited 407 00:24:43,960 --> 00:24:50,280 Speaker 1: quote smoke screen or any other obscurant intentionally discharged end quote. 408 00:24:51,040 --> 00:24:52,560 Speaker 1: I guess they were keeping in mind that a lot 409 00:24:52,560 --> 00:24:55,119 Speaker 1: of people participating in the challenge had built robots for 410 00:24:55,240 --> 00:24:58,520 Speaker 1: robot battles, and maybe they were taking the competition 411 00:24:58,520 --> 00:25:01,600 Speaker 1: super seriously, but I find that pretty amusing, the idea 412 00:25:01,680 --> 00:25:04,960 Speaker 1: that no smoke screens was something specifically pointed 413 00:25:04,960 --> 00:25:07,840 Speaker 1: out in the rules.
Teams were allowed to create a 414 00:25:07,880 --> 00:25:11,439 Speaker 1: backup vehicle to the one they intended to race, and 415 00:25:11,560 --> 00:25:14,760 Speaker 1: that way, if the primary vehicle should suffer some sort 416 00:25:14,800 --> 00:25:18,320 Speaker 1: of setback on race day, the backup vehicle could take 417 00:25:18,320 --> 00:25:21,800 Speaker 1: its place, but they had to be absolutely identical in 418 00:25:21,920 --> 00:25:25,560 Speaker 1: operation and in systems, and DARPA would put any second 419 00:25:25,640 --> 00:25:29,760 Speaker 1: vehicle through the same inspection and safety demonstration processes as 420 00:25:29,760 --> 00:25:32,879 Speaker 1: they would the primary vehicle. All vehicles had to have 421 00:25:32,960 --> 00:25:37,359 Speaker 1: a wireless emergency stop system built into them. An emergency 422 00:25:37,359 --> 00:25:41,080 Speaker 1: stop was the only permitted outside interference DARPA would allow 423 00:25:41,359 --> 00:25:44,080 Speaker 1: during the final event, and it would mean removing the 424 00:25:44,160 --> 00:25:46,760 Speaker 1: vehicle from consideration if you had to activate it. But 425 00:25:46,880 --> 00:25:48,359 Speaker 1: if you're at the point when you need to hit 426 00:25:48,400 --> 00:25:51,400 Speaker 1: the emergency stop on a vehicle, you've pretty much concluded 427 00:25:51,440 --> 00:25:54,080 Speaker 1: that you are no longer in the running. You're just 428 00:25:54,080 --> 00:25:57,360 Speaker 1: trying to minimize damage at that point. So DARPA supplied 429 00:25:57,480 --> 00:26:00,640 Speaker 1: every team entering into the qualifier event with a government 430 00:26:00,680 --> 00:26:03,239 Speaker 1: owned e-stop system, and it was up to the 431 00:26:03,240 --> 00:26:07,200 Speaker 1: teams to integrate that system into their respective vehicles and designs.
432 00:26:07,840 --> 00:26:11,200 Speaker 1: As for wireless signals in general, DARPA did allow vehicles 433 00:26:11,200 --> 00:26:14,480 Speaker 1: to receive wireless signals, but really only for the purposes 434 00:26:14,800 --> 00:26:19,600 Speaker 1: of position determination. So like GPS satellite data, they could 435 00:26:19,600 --> 00:26:23,480 Speaker 1: receive that, and vehicles could emit and sense signals as 436 00:26:23,560 --> 00:26:26,720 Speaker 1: part of sensing the environment, like using lidar, you know, 437 00:26:26,760 --> 00:26:31,639 Speaker 1: the laser-based variant of radar. But teams 438 00:26:31,640 --> 00:26:35,000 Speaker 1: would not be allowed to communicate with the vehicles and say, hey, 439 00:26:35,040 --> 00:26:37,000 Speaker 1: you want to turn left up ahead. You couldn't do that. 440 00:26:37,480 --> 00:26:39,800 Speaker 1: The equipment on the vehicle had to be able to 441 00:26:39,840 --> 00:26:42,879 Speaker 1: do one other really important task. The systems on board 442 00:26:42,880 --> 00:26:45,679 Speaker 1: each vehicle had to be able to accept a USB 443 00:26:45,760 --> 00:26:49,240 Speaker 1: two point oh flash drive, because that's how DARPA would 444 00:26:49,240 --> 00:26:52,800 Speaker 1: transfer the Mission Data File over to the car's system. 445 00:26:52,960 --> 00:26:55,680 Speaker 1: Cars had to be able to go from the MDF 446 00:26:55,880 --> 00:26:59,159 Speaker 1: loading process to full autonomy within five minutes. That was 447 00:26:59,200 --> 00:27:02,760 Speaker 1: the five-minute start, so you get the MDF, the 448 00:27:02,960 --> 00:27:06,160 Speaker 1: timer would start ticking, and your car had to leave 449 00:27:06,240 --> 00:27:09,280 Speaker 1: the starting area within five minutes.
There are a few 450 00:27:09,280 --> 00:27:11,800 Speaker 1: other things that teams had to take into consideration. 451 00:27:12,000 --> 00:27:14,560 Speaker 1: You know, I mentioned earlier that cars were not allowed 452 00:27:14,560 --> 00:27:16,800 Speaker 1: to stop for more than ten seconds at a stop 453 00:27:16,840 --> 00:27:21,320 Speaker 1: sign once you got through the process of making sure 454 00:27:21,760 --> 00:27:25,080 Speaker 1: the right of way was given. This was to 455 00:27:25,200 --> 00:27:27,760 Speaker 1: avoid the situation where someone gets to a stop sign and 456 00:27:27,800 --> 00:27:30,600 Speaker 1: they're afraid to go forward, so they just keep waving other 457 00:27:30,680 --> 00:27:34,520 Speaker 1: drivers through. It's the equivalent of that. And, you know, 458 00:27:34,600 --> 00:27:36,919 Speaker 1: making sure that the car could still operate even if 459 00:27:36,960 --> 00:27:40,199 Speaker 1: GPS reception was lost, that sort of stuff. So we 460 00:27:40,280 --> 00:27:44,040 Speaker 1: understand what the parameters were. How did it all turn out? Well, 461 00:27:44,040 --> 00:27:46,680 Speaker 1: I'll tell you, but first I'm gonna take a sip 462 00:27:46,680 --> 00:27:57,399 Speaker 1: of tea and thank our sponsor. After all the applications, 463 00:27:57,800 --> 00:28:01,399 Speaker 1: DARPA selected fifty three teams to move into the next 464 00:28:01,440 --> 00:28:04,640 Speaker 1: phase of the competition, and out of those fifty three teams, 465 00:28:05,160 --> 00:28:08,879 Speaker 1: DARPA would authorize thirty six of them to participate in 466 00:28:08,920 --> 00:28:13,840 Speaker 1: the National Qualification Event, the NQE. That's kind of 467 00:28:13,880 --> 00:28:17,160 Speaker 1: the precursor to the final race, and like the final race, 468 00:28:17,520 --> 00:28:21,000 Speaker 1: the NQE took place in a simulated urban environment.
469 00:28:21,760 --> 00:28:25,320 Speaker 1: DARPA would actually make use of a retired Air Force 470 00:28:25,359 --> 00:28:27,399 Speaker 1: base in California. It's an Air Force base that was 471 00:28:27,440 --> 00:28:31,160 Speaker 1: no longer in active use, had been essentially abandoned, and 472 00:28:31,320 --> 00:28:35,080 Speaker 1: they ended up taking it over and turning a very 473 00:28:35,119 --> 00:28:39,200 Speaker 1: small part of this very large base into a couple 474 00:28:39,240 --> 00:28:43,280 Speaker 1: of city blocks, or simulated city blocks. The NQE 475 00:28:43,800 --> 00:28:47,880 Speaker 1: had three major components to it. In one, autonomous cars 476 00:28:47,880 --> 00:28:52,560 Speaker 1: were to essentially circle a block by making left turns 477 00:28:52,640 --> 00:28:55,479 Speaker 1: at each corner. So go down the street, stop at 478 00:28:55,480 --> 00:29:00,520 Speaker 1: an intersection, make a left, go down to the next intersection, stop, 479 00:29:00,680 --> 00:29:03,160 Speaker 1: make a left, etcetera, etcetera, and it would try to 480 00:29:03,200 --> 00:29:06,360 Speaker 1: circle the block as many times as it could within a given amount 481 00:29:06,400 --> 00:29:11,600 Speaker 1: of time. Meanwhile, human drivers, professional stunt drivers, would be 482 00:29:11,680 --> 00:29:16,600 Speaker 1: driving along in both directions of traffic, so the car 483 00:29:16,640 --> 00:29:20,040 Speaker 1: would have to integrate itself into traffic, so not just 484 00:29:20,120 --> 00:29:23,560 Speaker 1: turning left, but turning left at a point that was 485 00:29:23,920 --> 00:29:27,880 Speaker 1: appropriate based upon the movement of traffic at that time. 486 00:29:28,280 --> 00:29:30,920 Speaker 1: Although all cars were supposed to behave as if it 487 00:29:30,920 --> 00:29:33,480 Speaker 1: were a four-way stop, from what I understand.
488 00:29:34,240 --> 00:29:37,160 Speaker 1: The next part of the NQE required cars to navigate 489 00:29:37,200 --> 00:29:42,400 Speaker 1: through a suburban environment and demonstrate the basic functions 490 00:29:42,560 --> 00:29:46,480 Speaker 1: of an autonomous car: navigation, rerouting if it encounters an 491 00:29:46,480 --> 00:29:51,120 Speaker 1: obstacle, parking, that kind of stuff. The third part 492 00:29:51,240 --> 00:29:54,120 Speaker 1: required the vehicles to navigate through a four-way stop 493 00:29:54,320 --> 00:29:57,320 Speaker 1: several times as human drivers would move through the area 494 00:29:57,720 --> 00:30:00,320 Speaker 1: at different speeds and that sort of thing, so to 495 00:30:00,360 --> 00:30:03,840 Speaker 1: make sure that the car was behaving consistently and safely 496 00:30:03,920 --> 00:30:07,680 Speaker 1: and in accordance with the law. Now, the original concept had 497 00:30:07,760 --> 00:30:11,400 Speaker 1: DARPA selecting the top twenty teams from that event to 498 00:30:11,520 --> 00:30:14,920 Speaker 1: move onward into the final competition, but based on the 499 00:30:14,960 --> 00:30:18,520 Speaker 1: performance of the vehicles at the NQE and some 500 00:30:18,640 --> 00:30:22,000 Speaker 1: concerns about what the safety issues might be if DARPA 501 00:30:22,080 --> 00:30:26,160 Speaker 1: were to put too many autonomous vehicles into the test 502 00:30:26,240 --> 00:30:30,520 Speaker 1: area at the same time, they ultimately selected only eleven 503 00:30:30,520 --> 00:30:34,240 Speaker 1: teams to continue on to the final event.
They felt 504 00:30:34,280 --> 00:30:37,160 Speaker 1: that if they had gone with twenty, there may have 505 00:30:37,280 --> 00:30:40,760 Speaker 1: been too high a concentration of autonomous cars to human 506 00:30:40,760 --> 00:30:45,000 Speaker 1: controlled cars and that it would increase the likelihood of 507 00:30:45,040 --> 00:30:48,520 Speaker 1: accidents happening. So the final event of the two thousand 508 00:30:48,560 --> 00:30:52,400 Speaker 1: seven challenge happened on November third, two thousand seven. Teams 509 00:30:52,440 --> 00:30:56,200 Speaker 1: had six hours to complete the challenge objectives while human 510 00:30:56,320 --> 00:30:59,440 Speaker 1: drivers and the other challengers were also on the course. 511 00:30:59,760 --> 00:31:03,160 Speaker 1: So you had people, stunt drivers, driving vehicles, and you 512 00:31:03,240 --> 00:31:06,720 Speaker 1: had other autonomous cars on the course while your autonomous 513 00:31:06,720 --> 00:31:09,920 Speaker 1: car was trying to complete its objectives, and all the while, 514 00:31:10,200 --> 00:31:11,720 Speaker 1: at the same time, you have to obey all the 515 00:31:11,760 --> 00:31:16,200 Speaker 1: California traffic laws. Now, out of those eleven finalists, six 516 00:31:16,320 --> 00:31:19,360 Speaker 1: completed the challenge, and three of them in less than 517 00:31:19,480 --> 00:31:22,440 Speaker 1: six hours. Now that doesn't sound like a lot, but 518 00:31:22,480 --> 00:31:26,280 Speaker 1: it's actually an incredible achievement. Remember, in the two thousand 519 00:31:26,280 --> 00:31:29,600 Speaker 1: four Grand Challenge there were no winners. In the two 520 00:31:29,640 --> 00:31:34,240 Speaker 1: thousand five Grand Challenge, five teams were able to complete 521 00:31:34,320 --> 00:31:37,600 Speaker 1: the course. In the two thousand seven Challenge, there were 522 00:31:37,640 --> 00:31:42,120 Speaker 1: only eleven finalists in total, and more than half of them.
523 00:31:42,240 --> 00:31:46,160 Speaker 1: Six of them completed the course, three of them within 524 00:31:46,200 --> 00:31:49,920 Speaker 1: the time frame. Coming in first place was Carnegie Mellon 525 00:31:50,040 --> 00:31:54,160 Speaker 1: University's Boss vehicle, which had the reputation for being a 526 00:31:54,200 --> 00:31:58,240 Speaker 1: bit of an aggressive driver, but the CMU team said, well, yeah, 527 00:31:58,280 --> 00:32:02,040 Speaker 1: it's a race. A little less than twenty minutes after 528 00:32:02,200 --> 00:32:05,720 Speaker 1: Boss's final task, after it crossed the finish line, the 529 00:32:05,840 --> 00:32:09,880 Speaker 1: Stanford Racing Team's Junior vehicle finished. So this was a 530 00:32:09,920 --> 00:32:13,440 Speaker 1: flip of the two thousand five Grand Challenge, where Stanford's 531 00:32:13,440 --> 00:32:16,280 Speaker 1: team came in first and then CMU's team 532 00:32:16,320 --> 00:32:19,600 Speaker 1: came in second and third. In third place of the 533 00:32:19,600 --> 00:32:23,560 Speaker 1: two thousand seven challenge was Virginia Tech's vehicle, Odin. That 534 00:32:23,600 --> 00:32:26,600 Speaker 1: one finished less than ten minutes after Stanford's vehicle, so 535 00:32:26,760 --> 00:32:30,160 Speaker 1: it was hot on the tail of Stanford's car. 536 00:32:30,960 --> 00:32:33,280 Speaker 1: The other three that finished were MIT's 537 00:32:33,400 --> 00:32:36,880 Speaker 1: Talos vehicle, which ended right around the six-hour time limit, 538 00:32:37,360 --> 00:32:40,960 Speaker 1: and then the Little Ben Toyota Prius from the Ben 539 00:32:41,080 --> 00:32:45,440 Speaker 1: Franklin Racing Team that had team members from the University 540 00:32:45,440 --> 00:32:49,880 Speaker 1: of Pennsylvania and Lehigh University, from Philadelphia, Pennsylvania.
And then 541 00:32:49,920 --> 00:32:54,280 Speaker 1: you had in sixth place Cornell's team, which was 542 00:32:54,320 --> 00:32:59,920 Speaker 1: called Skynet. Really cute, Cornell. The Cornell team and 543 00:33:00,040 --> 00:33:03,600 Speaker 1: the Philadelphia team, their cars finished after the six 544 00:33:03,640 --> 00:33:07,360 Speaker 1: hours had passed. Of the five vehicles that got a 545 00:33:07,480 --> 00:33:10,680 Speaker 1: DNF, as in did not finish, one of 546 00:33:10,720 --> 00:33:15,000 Speaker 1: them collided with another vehicle in a minor collision, two 547 00:33:15,000 --> 00:33:18,680 Speaker 1: of them ran into stationary objects, and two of them 548 00:33:18,800 --> 00:33:22,160 Speaker 1: froze after hitting either an intersection or a traffic circle, 549 00:33:22,520 --> 00:33:25,840 Speaker 1: and so they were disqualified. The outcome of the Urban 550 00:33:25,960 --> 00:33:30,040 Speaker 1: Challenge had a lasting legacy. While the autonomous cars that competed 551 00:33:30,560 --> 00:33:33,680 Speaker 1: were still far too limited to function on normal roads 552 00:33:33,680 --> 00:33:36,880 Speaker 1: (you would never want to unleash these on your average 553 00:33:36,880 --> 00:33:41,400 Speaker 1: city street), the incredible progress that had been made in 554 00:33:41,440 --> 00:33:45,280 Speaker 1: the field showed that driverless cars might actually be 555 00:33:45,440 --> 00:33:49,560 Speaker 1: a possibility, and maybe not that far off. But you 556 00:33:49,600 --> 00:33:53,960 Speaker 1: could also argue that this demonstration caused some people to 557 00:33:54,160 --> 00:33:59,160 Speaker 1: project an unrealistic expectation of when we might see autonomous 558 00:33:59,200 --> 00:34:02,440 Speaker 1: cars get a wide rollout on roads around the world. 559 00:34:02,880 --> 00:34:05,120 Speaker 1: And you can kind of understand where that would come from.
560 00:34:05,440 --> 00:34:08,879 Speaker 1: Right? The two thousand five Grand Challenge was an off-road 561 00:34:09,000 --> 00:34:12,840 Speaker 1: desert course. You didn't have to worry about making 562 00:34:12,840 --> 00:34:16,600 Speaker 1: your way through traffic or around stalled vehicles or anything 563 00:34:16,680 --> 00:34:19,840 Speaker 1: like that. The advances you would see in two thousand 564 00:34:19,960 --> 00:34:22,880 Speaker 1: seven suggested that, hey, maybe there's just a few tweaks 565 00:34:22,920 --> 00:34:25,240 Speaker 1: that need to happen, and then we're gonna have driverless 566 00:34:25,280 --> 00:34:28,680 Speaker 1: cars everywhere. But that ignores the fact that the challenges 567 00:34:28,719 --> 00:34:31,480 Speaker 1: were really, really difficult, and the teams that met those 568 00:34:31,560 --> 00:34:36,400 Speaker 1: challenges were still struggling with some really tough problems, and 569 00:34:36,400 --> 00:34:39,319 Speaker 1: they didn't have to take into consideration stuff like pedestrians. 570 00:34:40,320 --> 00:34:43,120 Speaker 1: So it would take a lot more than just refining 571 00:34:43,160 --> 00:34:46,960 Speaker 1: this approach to make a vehicle roadworthy. Now, one of 572 00:34:47,000 --> 00:34:51,120 Speaker 1: the technologies I want to mention before I close is lidar. 573 00:34:51,239 --> 00:34:53,760 Speaker 1: I've mentioned it once or twice, but I really 574 00:34:53,760 --> 00:34:57,080 Speaker 1: want to take a moment to point it out because 575 00:34:57,120 --> 00:35:01,960 Speaker 1: it was one of the standout developments of the various Grand Challenges. 576 00:35:02,520 --> 00:35:06,000 Speaker 1: So lidar works on the same basic principle as radar.
Right, 577 00:35:06,120 --> 00:35:09,520 Speaker 1: radar sends out radio signals and then there's a receiver, 578 00:35:09,920 --> 00:35:13,000 Speaker 1: and the receiver picks up the reflected radio signals, and 579 00:35:13,080 --> 00:35:16,200 Speaker 1: through measuring the time difference between when they were sent 580 00:35:16,239 --> 00:35:18,719 Speaker 1: out and when they came back, you can tell 581 00:35:18,760 --> 00:35:22,400 Speaker 1: how far away an object is. There's also Doppler shift 582 00:35:22,440 --> 00:35:24,279 Speaker 1: and stuff like that, but we've talked about that before, 583 00:35:24,320 --> 00:35:26,360 Speaker 1: so I'm not going to go into it here. Lidar 584 00:35:26,480 --> 00:35:28,920 Speaker 1: does the same sort of thing, but uses lasers instead 585 00:35:28,960 --> 00:35:32,560 Speaker 1: of radio waves. So you have a receiver, a sensor, 586 00:35:32,560 --> 00:35:35,239 Speaker 1: a photocell essentially, that's looking for that frequency of 587 00:35:35,320 --> 00:35:40,000 Speaker 1: light and can pick up those reflections and 588 00:35:40,000 --> 00:35:43,919 Speaker 1: then be able to project on a digital map where 589 00:35:43,920 --> 00:35:46,760 Speaker 1: an object is. But the problem was that the early 590 00:35:46,920 --> 00:35:52,560 Speaker 1: use of lidar was very primitive, it was very limited. 591 00:35:52,640 --> 00:35:55,840 Speaker 1: The lasers really could just be sent out in a 592 00:35:56,040 --> 00:35:57,960 Speaker 1: single direction and you would know that there was an 593 00:35:58,000 --> 00:36:00,680 Speaker 1: object out there, but you weren't getting very high resolution 594 00:36:01,920 --> 00:36:05,840 Speaker 1: results from this. And that changed in two thousand five, 595 00:36:06,280 --> 00:36:08,719 Speaker 1: and really changed in two thousand seven thanks to a 596 00:36:08,719 --> 00:36:13,280 Speaker 1: guy named Dave Hall.
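[Editor's note: the time-difference measurement the host describes is simple enough to write down. Here is a minimal sketch of time-of-flight ranging, the principle radar and lidar share; the function name and the 200-nanosecond echo time are just illustrative.]

```python
# Time-of-flight ranging: a pulse travels out to an object and back,
# so distance = (speed of light * round-trip time) / 2.
C = 299_792_458.0  # speed of light in a vacuum, meters per second

def range_from_echo(round_trip_seconds):
    """Distance to the reflecting object, in meters."""
    return C * round_trip_seconds / 2.0

# A pulse that returns after 200 nanoseconds bounced off something
# roughly 30 meters away.
print(round(range_from_echo(200e-9), 1))  # 30.0
```

[The same arithmetic applies whether the pulse is a radio wave or a laser; what lidar buys you is a much tighter beam, so each return tells you not just that something is out there but quite precisely in which direction.]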
Dave Hall had found out about 597 00:36:13,400 --> 00:36:17,920 Speaker 1: lidar from a guy named Jim McBride, who worked with Ford. Now, 598 00:36:17,920 --> 00:36:21,160 Speaker 1: when Dave Hall looked into lidar, he found it really interesting. 599 00:36:21,440 --> 00:36:24,400 Speaker 1: And Dave Hall, he was the founder of a company, 600 00:36:24,840 --> 00:36:29,560 Speaker 1: still is actually, that was called Velodyne, still is 601 00:36:29,560 --> 00:36:31,600 Speaker 1: called Velodyne, and it was in the business of doing 602 00:36:31,640 --> 00:36:36,400 Speaker 1: audio equipment. He was getting bored with doing audio equipment. 603 00:36:36,440 --> 00:36:40,320 Speaker 1: He thought, well, I'll play in this autonomous car area 604 00:36:40,360 --> 00:36:43,360 Speaker 1: for a while. He had also participated in various robot 605 00:36:43,400 --> 00:36:49,239 Speaker 1: war challenges and competitions. So he goes and looks into 606 00:36:49,320 --> 00:36:51,920 Speaker 1: lidar and he thinks, well, this has got some potential, 607 00:36:52,160 --> 00:36:54,799 Speaker 1: but it's really limited based on how people are using 608 00:36:54,800 --> 00:36:57,680 Speaker 1: it right now. And while he hadn't even known what 609 00:36:57,760 --> 00:37:00,279 Speaker 1: lidar was in two thousand four, by two thousand five 610 00:37:00,320 --> 00:37:03,680 Speaker 1: he had invented a lidar tool that would become incredibly important 611 00:37:03,680 --> 00:37:06,600 Speaker 1: for all the Urban Challenge teams.
He had what was 612 00:37:06,640 --> 00:37:11,239 Speaker 1: the equivalent of sixty four lasers that were scanning outward, 613 00:37:11,680 --> 00:37:15,719 Speaker 1: and they were mounted on a spinning scanner. So the 614 00:37:15,760 --> 00:37:18,359 Speaker 1: scanner would spin in a circle and you would get 615 00:37:18,400 --> 00:37:23,040 Speaker 1: a three hundred sixty degree scan of the area around 616 00:37:23,040 --> 00:37:25,720 Speaker 1: a vehicle, and that would allow him to gather information 617 00:37:25,800 --> 00:37:28,920 Speaker 1: about all the objects surrounding a vehicle, their distance from 618 00:37:28,920 --> 00:37:32,040 Speaker 1: the vehicle, even whether or not those objects were moving 619 00:37:32,400 --> 00:37:35,839 Speaker 1: toward or away from the vehicle. Now, that wouldn't make 620 00:37:35,920 --> 00:37:38,239 Speaker 1: him win the two thousand five challenge. He didn't, but 621 00:37:38,360 --> 00:37:40,960 Speaker 1: his tool got a lot of attention. People noticed, hey, 622 00:37:40,960 --> 00:37:45,120 Speaker 1: this is a really cool technology. It's incredibly innovative. So 623 00:37:45,160 --> 00:37:47,840 Speaker 1: in two thousand and seven, a whole bunch of the 624 00:37:47,920 --> 00:37:50,440 Speaker 1: vehicles that were submitted as part of that Urban Challenge 625 00:37:50,680 --> 00:37:54,240 Speaker 1: sported a lidar scanner that was created by Hall's company, 626 00:37:54,320 --> 00:37:57,400 Speaker 1: Velodyne. And in fact, Velodyne is a leading 627 00:37:57,480 --> 00:38:04,320 Speaker 1: manufacturer of lidar systems specifically for autonomous car use. So 628 00:38:04,680 --> 00:38:08,520 Speaker 1: he became a big shot billionaire from that. It was 629 00:38:08,840 --> 00:38:12,239 Speaker 1: a really amazing development.
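[Editor's note: a spinning multi-laser scanner like the one described reports, for each laser firing, an angle and a range, and turning those readings into points around the vehicle is a spherical-to-Cartesian conversion. Here is a minimal sketch of that geometry; the 10-meter range, the fixed downward tilt, and the one-degree steps are invented for illustration and are not Velodyne's actual data format.]

```python
import math

def beam_to_point(range_m, azimuth_deg, elevation_deg):
    """Convert one laser return from a spinning scanner into an
    (x, y, z) point relative to the vehicle."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    horiz = range_m * math.cos(el)   # projection onto the ground plane
    return (horiz * math.cos(az),    # forward
            horiz * math.sin(az),    # left/right
            range_m * math.sin(el))  # up/down

# Sweep one laser through a full rotation in 1-degree steps, with the
# beam tilted 5 degrees downward: a 360-degree ring of points around
# the vehicle. A real scanner does this with many lasers at once.
ring = [beam_to_point(10.0, az, -5.0) for az in range(360)]
print(len(ring))  # 360
```

[Stack sixty-four of these rings at different tilt angles, many times per second, and you get the dense three-dimensional point cloud that made the spinning scanner so valuable to the Urban Challenge teams.]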
Now, in the next episode, I'm 630 00:38:12,239 --> 00:38:14,239 Speaker 1: going to talk a little bit more about where driverless 631 00:38:14,320 --> 00:38:19,120 Speaker 1: cars went after the Urban Challenge, figuratively speaking. And 632 00:38:19,120 --> 00:38:22,879 Speaker 1: then on the episode following that, we're going to look 633 00:38:22,920 --> 00:38:27,920 Speaker 1: more into the philosophy that autonomous cars are great, 634 00:38:28,280 --> 00:38:30,920 Speaker 1: and we're also gonna look at some opposing viewpoints, not 635 00:38:30,960 --> 00:38:35,520 Speaker 1: necessarily that autonomous cars are bad, but perhaps suggesting 636 00:38:35,840 --> 00:38:39,680 Speaker 1: that we're not nearly as far along as some futurists 637 00:38:39,719 --> 00:38:42,120 Speaker 1: would like us to believe. So that's gonna be the 638 00:38:42,120 --> 00:38:44,040 Speaker 1: next couple of episodes, but for now we're going to 639 00:38:44,040 --> 00:38:46,520 Speaker 1: wrap this up. If you guys have suggestions for 640 00:38:46,640 --> 00:38:49,880 Speaker 1: future show topics, whether it's for a single episode or 641 00:38:49,920 --> 00:38:52,400 Speaker 1: maybe it's an arc like the one we're currently in, 642 00:38:53,080 --> 00:38:55,160 Speaker 1: let me know. You can send me an email. The 643 00:38:55,160 --> 00:38:59,760 Speaker 1: address is tech stuff at how stuff works dot com, 644 00:38:59,880 --> 00:39:01,919 Speaker 1: or you can head on over to our website, that's 645 00:39:02,160 --> 00:39:06,200 Speaker 1: tech Stuff podcast dot com, and you can let us 646 00:39:06,320 --> 00:39:09,160 Speaker 1: know through the various links to our social media. Also, 647 00:39:09,239 --> 00:39:12,319 Speaker 1: don't forget our merchandise store over at t public dot 648 00:39:12,360 --> 00:39:15,960 Speaker 1: com slash tech stuff. Remember every purchase you make goes 649 00:39:16,000 --> 00:39:19,000 Speaker 1: to help the show.
We greatly appreciate it, and I'll 650 00:39:19,040 --> 00:39:27,600 Speaker 1: talk to you again really soon. For more on this 651 00:39:27,760 --> 00:39:30,280 Speaker 1: and thousands of other topics, visit how stuff works 652 00:39:30,280 --> 00:39:40,600 Speaker 1: dot com