Speaker 1: Welcome to TechStuff, a production of iHeartRadio's How Stuff Works. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech. And if you haven't noticed already, yeah, my voice is all sorts of jacked up because I'm getting over a cold. I apologize for that, but the tech must go on.

Now, just a couple of years ago, the tech world in general was pretty optimistic about autonomous cars, and I include myself in that group. I remember seeing the remarkable progress that had come out from the first DARPA Grand Challenge up to about, I don't know, 2010 or so, and it seemed like we were just on the verge of having fleets of robotaxis at our beck and call. But now we've gone on for several more years, and we're still at a point where only a handful of companies are conducting limited tests. Plus, there have been some high-profile cases of accidents involving vehicles operating under autonomous or semi-autonomous modes that ended in tragedy. So in this episode, we're going to take a look at autonomous cars and where we stand today.

Now, I think it helps if we start by running through the levels of autonomy. Not everyone uses these levels to talk about autonomy, and to be honest, the barriers between levels are a bit fuzzy, and sometimes we're not really able to say where we are as far as levels of autonomy go. We can look back at previous developments and say, all right, judging from where we are now, this falls into level two or level three, but it can be a little difficult to see what level we are currently in without, you know, truly remarkable evidence. In general, though, this is a useful way to talk about how far along we are toward getting fully autonomous cars. Technically, the levels range from zero to five, so that means there are really six levels. However, level zero really means there is no autonomy at all.
So with that type of vehicle, the human driver is responsible for all operations of the vehicle. Every driving task is handled by the driver alone. Some folks will say there are really just five levels of autonomy, since level zero would refer to vehicles that, really, honestly, don't exist that much anymore. Now you might be thinking, but hey, Mr. Smarty Pants podcast person, I drive a car and it doesn't have any autonomous vehicle features. But depending upon whom you ask, features like power steering or anti-lock brakes or cruise control and other pretty common features fall into the low-level autonomous range. It doesn't mean your car is autonomous, but it has some of the components that are identified with this concept of autonomy. So most cars today are actually above level zero if we go by that definition; they're level one or higher.

So level one autonomy would apply to cars where the driver still controls the vehicle. The vehicle is still under driver control, but the car has some driver assistance features like power steering or anti-lock brakes. The car might have what is called an advanced driver assistance system, or ADAS, and the word "advanced" makes it sound a bit fancier than it is. At this particular level of autonomy, the car might have systems that help people steer, or it might have systems that help accelerate and/or brake, but the steering and accelerating, or steering and braking, can't happen simultaneously. Either one or the other can be taken over by these systems, but not both at the same time. Not with level one.

If we get up to level two autonomy, then we're talking about partial automation. The ADAS on these cars can do stuff like control steering and braking, or steering and accelerating, at the same time, at least under certain circumstances. But even in those cases, the car's driver still remains primarily in control of the vehicle.
With this level of autonomy, a driver would still not remove their hands from the wheel, as the car would need the human's participation to, you know, work safely. So with level two autonomy, you still have to have your attention on the road, you still have to have your hands on the wheel. It's just that the car can occasionally kick in and assist in various scenarios, typically in very restricted cases.

Now, at level three autonomy, we're getting up to conditional automation. These cars would still require a human driver, but there can be times when the car's systems can operate the vehicle on their own and the driver is essentially a passenger. During those moments, the driver is still supposed to monitor the environment. They're still supposed to be prepared to take over the car should the vehicle indicate it needs to hand over control to the person behind the wheel. So ideally there would be a system where the car would identify a situation in which the driver needs to take over, and then, well in advance of that situation becoming imminent, it would alert the driver to take over control of the car. This is trickier, right? This is harder to do than to say, because the car would have to know far enough in advance to be able to send that alert to the driver, and the driver would have to be able to respond to it. And while we feel like our response time is really fast, in computational terms we are snails. We move super slow. So this is actually pretty tricky, especially if you're talking about a dynamic situation where things are changing very rapidly. At level three, autonomous cars are supposed to do this seamlessly, and as I said, that is a pretty tricky thing to do technically. Most vehicle systems we're looking at now, especially ones like Tesla's Autopilot, fall somewhere in level three.
Level four autonomy is at a point where a vehicle can automatically operate itself at least under certain conditions, but not necessarily all driving conditions. The vehicle would likely include the option for a human driver to take over operations, but under normal, you know, conditions, the car would pretty much drive itself. So with level four autonomy, you would have self-driving cars that could act as self-driving cars most of the time, but also allow a human driver to take over if the human driver wanted to.

Level five autonomy is a fully autonomous car. The car can operate itself under all driving conditions, so any condition where a human would be driving a car, a level five autonomous car should be able to operate in that same situation. There may not be any steering wheel or any controls at all in the vehicle, meaning there's no option for a human driver to take over. Now, that's not a prerequisite. You can have a level five fully autonomous car that would still have controls and still allow humans to take over manually if they chose to do so. It's just that it's an option; it's no longer mandatory to have those human-based controls with a level five autonomous car. We don't have any of these yet, so really, talking about this is purely hypothetical. Arguably, we have some that are in the level four range, but, and we'll get to that, they're under very strict parameters.

All right, so most experts agree that the versions of autonomous cars we've seen so far are mainly in the level three and level four categories, creeping more toward a firm level four. We're kind of in the early stages of that, and there are several test programs that are operating almost as if we're at level five. But there's disagreement about whether or not the technology is really sophisticated enough to warrant us calling any existing vehicle a level four or level five autonomous car.
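Here's one way to keep those six levels straight: a small Python sketch that encodes each level with a one-line summary. The wording paraphrases the discussion above, not the official SAE J3016 definitions.

```python
# A rough summary of the autonomy levels as described in this episode.
# These one-liners paraphrase the discussion above; they are not the
# official SAE J3016 wording.

AUTONOMY_LEVELS = {
    0: "No automation: the human driver handles every driving task.",
    1: "Driver assistance: the system helps steer OR brake/accelerate, never both at once.",
    2: "Partial automation: steering AND braking/acceleration together, driver still in control.",
    3: "Conditional automation: the car drives itself at times but hands control back to a ready human.",
    4: "High automation: self-driving under certain conditions; a human can usually still take over.",
    5: "Full automation: drives anywhere a human could; manual controls become optional.",
}

def describe(level: int) -> str:
    """Return the one-line summary for a given level, e.g. describe(3)."""
    return f"Level {level}: {AUTONOMY_LEVELS[level]}"

if __name__ == "__main__":
    for lvl in range(6):
        print(describe(lvl))
```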
And so, while we have some examples of cars, and I'll talk about a couple of them, that lack control systems for human drivers, they are almost all prototypes and concept vehicles, or in very limited testing situations. And so they don't really rank as level five autonomous cars, because while they lack the controls, they cannot operate in every situation and environment that humans drive in. So it's too early for us to talk about deploying cars that have no way to hand over control to a human driver in all regions and in all, you know, driving situations.

Okay, so let's do a very quick rundown on the history of autonomous cars up to, say, a few years back, and see why some folks, like yours truly, were so bullish on the future of autonomous cars. The history stretches back a good long way, particularly if we're looking at stuff like power steering, but that's getting way too granular; I'm not going to do that. And the history is also really complex, in that it involves lots of different disciplines converging into the autonomous car form factor. You have stuff like robotics, you know, sensor development, artificial intelligence, computational processing power, range-finding technology, lots of things that all have to come together. And to really dive into the complex history of all the technologies that are coming together to make autonomous cars possible would require a whole miniseries of episodes, so we're not going to jump into all of that in this one. Instead, I want to focus on things like the DARPA challenges that were created in the mid-2000s. The first one was in 2004, and DARPA, as you'll recall, is the research and development arm of the Department of Defense in the United States. So it's technically an organization that funds various other groups to do R&D in technologies that ultimately stand to benefit the defense of the United States.
So while there are other uses for those technologies that don't directly relate to defense or military systems, that's the primary purpose for DARPA. In 2004 they created this challenge. They called for teams to build or convert vehicles into autonomous cars that were capable of navigating a long-distance desert course more than a hundred miles long, and there needed to be no human operators: the car could not be remotely controlled, nor would there be a driver in the vehicle. The idea was to design a car that would be capable of traveling a predesignated route from beginning to end. For that 2004 challenge, no team was able to complete the course. Cars failed; some of them went off road and got stuck, some of them just got confused and stopped. So no one completed it within the time frame that DARPA had set, but it set the stage for subsequent competitions.

In 2005, DARPA held another Grand Challenge, again with a desert course. This one was 132 miles long, and this time five teams were able to complete the route. The winning team was from Stanford University: the Stanford Racing Team. They clocked the shortest time on the course, though by "shortest time" I'm still talking about a long time. It took them six hours and fifty-three minutes to make the 132-mile journey. That would mean the average speed for the vehicle, taken across the whole course, was somewhere around eighteen miles per hour, or approximately twenty-nine kilometers per hour, which isn't exactly tearing up the track. But it was still a very impressive achievement, and I don't want to take away from what they achieved. It was incredible, especially for the time, but it's not the sort of speed you would look at and think, oh, well, this is the replacement for the modern car.
The next challenge would happen in 2007, and it switched things up by requiring teams to design a car capable of navigating through a simulated urban environment, complete with traffic and traffic laws, you know, traffic lights and stop signs and simulated pedestrians. It wouldn't be enough to design a car that could detect a road and follow it, or even a car capable of managing stuff like how to send torque to different wheels in order to get out of a tricky situation. The cars would need advanced collision detection and decision-making capabilities. They'd have to obey traffic laws, they'd have to be able to adapt to potentially changing situations, the kind of stuff you might find if you're driving around a city. In that case, six teams were able to finish the course. Stanford Racing would actually take second place that time; they clocked in at just under four and a half hours. First place went to a group called Tartan Racing from Carnegie Mellon University, and they finished in four hours and ten minutes.

Now, the purpose of these competitions wasn't just to find out which groups of smarty-pants engineers were able to build the best car. It was an attempt to kick-start serious development in the various fields related to making autonomous cars a possibility. Engineers worked on all sorts of different designs. Some incorporated lots of optical cameras. Some used lidar, which is a type of laser-based range-finding technology similar to radar. It works by zapping out a laser and then detecting any reflections coming back from that laser light. It uses an array of sensors looking for any evidence of that laser light coming back to the sensor, then measures the time difference between when the laser went out and when it picked up the reflection. And then, with some math, it can figure out how far away an obstacle is from the vehicle.
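To make that range-finding math concrete, here's a minimal Python sketch of the time-of-flight calculation: the pulse travels out and back at the speed of light, so halving the round trip gives the distance. Real lidar units fire thousands of pulses per second across an array of sensors; this is just the core arithmetic for one pulse.

```python
# A minimal sketch of the time-of-flight math behind lidar ranging.

SPEED_OF_LIGHT_M_PER_S = 299_792_458  # meters per second

def range_from_round_trip(seconds: float) -> float:
    """Distance to the obstacle, given the laser pulse's round-trip time.

    The light travels out and back, so we halve the total path length.
    """
    return SPEED_OF_LIGHT_M_PER_S * seconds / 2

# Example: a reflection detected 200 nanoseconds after the pulse fired
# puts the obstacle roughly 30 meters away.
print(f"{range_from_round_trip(200e-9):.1f} m")  # ~30.0 m
```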
Lidar can do more than that, too: it can also figure out whether that obstacle is moving or stationary, and whether it's moving away from or toward the vehicle. It can figure out all of that. I've talked about lidar in past episodes, so I won't get into the full technical details here, but it was one of those key components that's used in some, though not all, of the vehicles that fall under this autonomous car development. It's interesting, because there are lots of different companies working on autonomous cars, and they are not all relying on exactly the same technologies to achieve that goal. Some of them are much more heavily focused on optical cameras. Some of them are more focused on things like lidar and other sensors. Some of them involve a whole, you know, slew of different technologies that are meant to be both primary systems and redundant systems.

So it was really impressive to see these teams complete the Urban Challenge, but again, it didn't immediately make everyone think driverless cars would be available right away. The challenges, while impressive, didn't compare to what the average human driver deals with on a regular day. The competition times were pretty long; the average speeds were all below fifteen miles per hour, so below twenty-four kilometers per hour. At that speed, it was clear that these vehicles were just the earliest incarnations of the technologies that would power autonomous cars in the future. They were erring on the side of caution, which, frankly, you want in the first place. You don't want to see a lot of people say, let's take some chances, when you're talking about vehicles; I mean, there are human lives at stake.

Meanwhile, another narrative drives home, pun intended, why a lot of folks got really hyped up about autonomous cars. It's also a sobering line of thought.
And I'm talking, of course, about the frequency of fatal car accidents and how many of them can be traced back to human error. Now, getting global statistics on this is pretty tough, so I'm going to focus on the United States, because we have a lot of organizations in the US that track these kinds of numbers, and you can kind of get an idea of how big the problem is. Recently, the National Safety Council released a report that estimated forty thousand people had died in car accidents in the United States in a single year. That actually amounted to a decline of one percent from 2017, when slightly more than forty thousand people died. Another four and a half million people on top of that had been seriously injured in car crashes that same year. Meanwhile, the National Highway Traffic Safety Administration in the United States has said that the vast majority of serious car crashes result from human error or dangerous choices. So, in other words, mechanical failures contribute only a very small percentage to the overall numbers when it comes to serious car accidents. Most serious car accidents aren't caused by a tire blowout or, you know, a car failing in some way. They're caused by humans doing something wrong, whether it's totally by accident or someone just makes a really bad decision, like they think, oh sure, there's no dashed line here, but I'm gonna go ahead and try to pass this person on this winding rural road because I bet nobody's coming the other way. That's what we would call a bad decision.

So, says the tech optimist, if you could create autonomous cars that operate safely, you could eliminate the vast majority of car crashes, and thus fatalities, on the road. You just remove the human error element, and suddenly you're talking about a staggering result. And that is an incredibly powerful motivator. Tens of thousands of people wouldn't die each year from these car accidents.
Millions more would never be injured or affected by the tragic loss of a loved one in an accident. Then you start moving outward, you go out another circle. Think of this as a ripple effect: imagine all the contributions those people might make in the future, contributions they'll get a chance to make because they wouldn't have had this terrible car crash, things we never would see come to fruition if they were to die in a fatal crash. It becomes this butterfly effect issue. And of course, we want to make the roads safer for everyone.

Now, I'm sure all of you have already hit upon the major issue here. The whole concept of people being safer in autonomous cars is contingent upon those autonomous cars performing better than humans already do, in every type of situation in which humans find themselves driving. If we can't get that right, then we haven't made things safer at all. All we would have done is shift the cause of the accidents from human error to machine error, or computer error. So we must be absolutely certain that the vehicles we make meet a very high standard if our goal is to reduce car accidents. We have to prove that these machines operate better than people do in all the different situations people find themselves driving in before we can make any sort of declarative statement that this is the best way forward. Now, when we come back, I'll talk about why this gets super tricky, and talk about thought experiments and also some real-world scenarios that kind of illustrate why this is harder than it sounds. But first, let's take a quick break.

[Ad break]

Speaker 1: Before the break, I posited that a future with autonomous cars that all but eliminate fatal car crashes hinges upon building driverless vehicles that are much better at driving than humans are, in all situations. Now, we could get a bit more loosey-goosey here, but doing so brings up some tough ethical issues.
So, for example, what if we knew that machines were better, right, that autonomous cars are better than human drivers, but they are by no means perfect? What if we could be certain that autonomous cars, if widely adopted, would reduce those fatalities by half, for example, but they would still be at fault in the other half of fatal car accidents? So let's say it's, you know, some year down the road, I don't know, and we have level four autonomous cars that are pretty reliably level four, and they are better as a whole than human drivers are. We've seen a vast reduction in vehicles operated by humans. Let's even assume that most cars are now controlled by computers, but let's also assume they're not perfect. Now, using those recent numbers, if humans were still in control, we would expect to see another forty thousand fatalities per year due to human error. And I'm just using that number as an example; I realize that in reality we'd be talking about the large majority of forty thousand, not all of them. But now that cars are in control, it means half of those accidents are totally prevented, but we still see fatal accidents that claim twenty thousand lives. On the one hand, we could look at that scenario and say, based upon what we know from past experience, we would have seen many more people die in accidents if humans were actually still operating cars. But on the other hand, that's all hypothetical, right? I mean, we can only know anything based on what actually happened, not on what might have happened if things had gone a different way. We can't be sure. More than that, though, we're still talking about twenty thousand people losing their lives, and all the ripple effects that makes throughout society. And moreover, we have machines that are at fault for those twenty thousand lives being lost, and the idea that people have built machines that, through a failure of some sort or another, resulted in deaths is a very difficult proposition to accept.
Also, it's just icky to think of it in terms like that. I mean, clearly, one death is too many. We don't want to see anyone die in a car accident. Having a discussion in which you compare a smaller number of deaths and refer to it as quote unquote "better" is something that's pretty hard for us to process. It's easier to do it the other way, right? I mean, it's obvious that forty thousand people dying is worse than twenty thousand people dying, but it's hard to view it the other way, because anyone dying at all is awful.

Now, part of this also really boils down to a fear of handing over control to a machine. I know a lot of people balk at that idea. They don't like the idea of not being the actual entity making decisions behind the wheel. Confronting them with statistics showing how human error leads to catastrophe doesn't tend to sway them. I mean, a lot of people think, well, yeah, that's other people; I am not that person. Also, to be fair, we don't have the evidence to show that computers would necessarily be better, so there's something to that.

Okay, let's get back to where we were in our history. The Grand Challenges helped set the stage for the next phase of development, which was mostly the realm of startups and some big companies, namely Google, which would hire participants from the Grand and Urban Challenges to come and work in new divisions dedicated to creating driverless cars. The early pioneering work was now shifting, pun intended, into a phase of rapid iteration, as engineers and computer scientists and mechanics began to refine technologies to make them better, going from the first sort of proof-of-concept approach to, how do we make this a better design so it does the thing it does, but more effectively? Google's program began in earnest around 2009, not long after the Urban Challenge. In 2010, publications began to report on the project, so it had been secret for about a year, maybe almost two years.
Google had been testing vehicles in and around the company's Mountain View, California headquarters. And while the vehicles still had manual controls and they still had a driver behind the wheel, there were at least some segments of some of those test drives that were totally under the control of the vehicle itself. It was racking up miles of autonomous driving experience, it was gathering data, and the people working in the division used that data to further refine their approach. By that point, the company had logged more than one hundred forty thousand miles driven by autonomous vehicles, which works out to around two hundred twenty-five thousand kilometers. And that's a pretty, you know, respectable distance. But let's compare that against the miles that were driven by human drivers in the United States. In a single year, US drivers accumulated nearly three trillion miles. Trillion. So that means if you were to ask what percentage of miles the Google cars drove compared to human drivers in the US, the Google cars would account for about 0.0000047 percent of all vehicle miles traveled. So you can call that a fraction of a percent, but even that is being generous. It's a fraction of a fraction of a fraction of a percent.

Now, if you're familiar with the idea of things like conducting surveys, you know that sample size is really important, right? If you ask five people a question, extrapolating those five answers to try to apply them to the population at large is a bad idea. It's not a good sample size. You don't have enough data to draw any conclusions; it's definitely bad science. So it makes little sense to compare the results of autonomous vehicles that haven't even come close to accumulating a meaningful percentage of the miles driven by the population at large. You cannot compare the two, because the experience is so monumentally different.
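For the curious, that fraction works out like this; a quick Python sketch using the rough figures quoted above:

```python
# The back-of-the-napkin comparison from above: Google's logged autonomous
# miles (circa 2010) against total annual US vehicle miles traveled.
# Both inputs are the rough numbers quoted in the episode.

google_autonomous_miles = 140_000               # miles logged by the project
us_vehicle_miles_per_year = 3_000_000_000_000   # ~3 trillion miles

share = google_autonomous_miles / us_vehicle_miles_per_year
print(f"{share:.2%}")         # 0.00%  -- rounds to nothing at two decimals
print(f"{share * 100:.7f}%")  # 0.0000047%  -- a fraction of a fraction of a fraction
```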
Now, for several years, Google's cars operated without any accidents, at least not any that were the fault of the driverless car itself. There were a few incidents, but they either happened when the safety driver was operating the car, so a human driver was driving the Google car, not the autonomous vehicle mode, or they were the fault of some other driver: someone in a totally different car got into an accident with a Google car, and it wasn't the fault of the autonomous system but rather the other driver. Those were really the only two categories of incidents that happened in the early days of Google's testing. So at first glance, it looked like the driverless cars were truly safer than a human-operated vehicle, right? They had a much better record than human drivers did, and it may very well be the case that they were in fact much, much safer than human drivers. But we have to go back to the sense of scale here.

In the United States, drivers travel more than three trillion miles by vehicle per year; I think the most recent figure was almost 3.3 trillion. We're getting ridiculously high in numbers. And there are around forty thousand fatalities per year. For the sake of this example, we'll assume all of those fatalities were caused by human error or bad decisions, just to simplify things. So if we do some rough math, we'll see that that amounts to one death per seventy-five million miles driven. Now, that's my estimate, just based on the back of a napkin. The actual estimate is even more generous than that: the National Safety Council estimates that there are 1.25 deaths per one hundred million vehicle miles driven. So what does that mean for autonomous vehicles? Well, they haven't driven close to a hundred million vehicle miles. It means that in those early days, when we first learned that Google had launched its project, there were so few miles accumulated that you couldn't draw any meaningful conclusions.
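Here's that back-of-the-napkin math as a quick sketch, using the episode's rounded figures. The small gap between the result and the National Safety Council's published 1.25 comes from the real mileage total being a bit higher than the rounded 3 trillion used here:

```python
# The fatality-rate arithmetic from above, as a quick sanity check.
# Inputs are the episode's rough US figures.

miles_per_year = 3_000_000_000_000   # ~3 trillion vehicle miles
deaths_per_year = 40_000             # ~40,000 road deaths

miles_per_death = miles_per_year / deaths_per_year
print(f"one death per {miles_per_death:,.0f} miles")  # one death per 75,000,000 miles

# Expressed the way the National Safety Council does, per 100 million miles:
rate_per_100m = deaths_per_year / (miles_per_year / 100_000_000)
print(f"{rate_per_100m:.2f} deaths per 100 million vehicle miles")  # 1.33
```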
To be fair, I don't think many people were trying to argue that autonomous car technology, as it was in 2010, was already clearly superior to human driving. This was still an early testing phase. This was a point where it wasn't about showing that the technology was already better than humans. It was rather about showing, hey, we've created technology that will allow this car to navigate and maneuver through human environments without it becoming a problem. So it wasn't even that the standard was higher than human capability. It's more like, can this machine operate at the same level as humans within certain parameters, pretty restrictive parameters?

Skip ahead a few years, and several companies had invested in driverless car technologies. That included big car companies, you know, Toyota and Chrysler and GM and others; they've all invested huge amounts of money in autonomous car research and development. It also included independent startups that either were working on components for autonomous cars, like lidar systems, or were attempting to convert or build fully autonomous vehicles themselves. And then there were ride-hailing companies, most notably Uber, that were also investing billions of dollars in this technology with an eye on replacing the fleets of human-operated vehicles that were the bread and butter of their company, to turn them all over to robotaxis. So instead of having human drivers, Uber, at the highest level, wants to replace them with autonomous vehicles, for reasons that are complex but mostly come down to money. Meanwhile, consumer vehicles were getting more and more sophisticated, and higher-end vehicles started sporting some really nifty features that relate to autonomous cars or are semi-autonomous in themselves. Some of them are more modest, like lane assist or braking assist safety features.
Some are a little more spectacular, like the self-parking capabilities that some cars have, where they can park themselves and pull out of parking spaces all by themselves. Like, that's pretty cool. These weren't intended to make consumer cars autonomous, but were rather positioned as sort of value-added options for cars, like, this is something nifty this car has, other cars don't have it, don't you want to buy this car? And they give a hint of what might be in days to come.

A few years back, Elon Musk started talking about an autopilot-like feature for cars, and sure enough, the following year, Tesla unveiled a driver-assist suite of features called Autopilot. Now, personally, and I've talked about this before, I think naming it Autopilot was the wrong move. I feel like the word "autopilot" has a loaded meaning to it. It conveys a sense that the car will take care of everything for you, and that's not necessarily the case. In fact, that's not the case at all. The company tried to walk that back a bit, not by renaming it, which I think they needed to do, but by including messages, and this they also needed to do, that said drivers were not meant to remove their hands from the wheel or to take their attention away from the road, that these systems can assist, but they don't replace the need for a driver. And you have to agree to that before you can enable the Autopilot feature. So the goal was saying, well, you have to acknowledge the fact that no, this is not meant to be an autonomous car. And not to go off on too much of a tangent, but I feel as though Elon Musk might be a little too aggressive with his projections about autonomous cars. And I don't mean to suggest that Elon Musk and Tesla are interchangeable. I do see that happening a lot in tech circles, where people will use one or the other interchangeably, and they are two different entities.
But maybe Tesla the company's bravado stems from Elon Musk's own personality. Whatever the case, Autopilot has proven to have its own limitations, and we saw that manifest in some rather high-profile and tragic accidents. Beginning in 2016, there have been several fatal accidents involving Tesla vehicles operating in Autopilot mode. The first one took place on January 20, 2016, in China, and the most recent examples I know about took place on a single day in December 2019, when there were actually two crashes with fatalities involving Tesla vehicles reportedly engaged in Autopilot. I say "reportedly" because I don't have access to all the data; I don't know if it's been conclusively established that both of these vehicles were actually operating in Autopilot mode. One of them happened in California and the other happened in Indiana, both in the United States, on that same December day.

Now, Tesla states that Autopilot is meant as a driver-assist feature and it's only semi-autonomous. But at the same time, Elon Musk has said repeatedly that his goal was to get a fully autonomous vehicle on the road by the end of 2019, which has now been pushed back to sometime in the first quarter of the following year. So there are some conflicting messages coming out, since a fully autonomous car, and I'm talking about something that we would at least classify as level four, if not level five, is well beyond just a driver-assist mode. And I should also add that Tesla drivers have a responsibility to use these features safely and as intended. If someone is taking their attention off the road, or they're sitting back from their steering wheel, or they're taking a nap, or they're watching Netflix or whatever, that's dangerously irresponsible behavior, and they are accountable for it. I don't want to give you guys the impression that I think Tesla the company is fully to blame in this case.
Speaker 1: I actually think it's a shared responsibility. You've got some drivers who are eager to test out admittedly really cool and technologically advanced features, and you have a company that might message out those features in a way that isn't perhaps the most realistic or responsible. It's a really bad combination, right? You've got tech heads who are eager to play with the newest stuff, and you've got a company that's built its reputation on creating super cool new stuff. It's only natural that when you combine those two, you can get some bad situations if the features haven't been messaged properly. And I really feel that Tesla bungled this: the rollout needed to be done in such a way that there was never the implication that this was an autonomous mode. Saying, hey, it's not autonomous after you've already called it Autopilot and put the idea into people's heads is a little late in the game. So I think all parties here share accountability. It's not just Tesla the company's fault, and it's not entirely the drivers' fault, although I think it's more their fault than the company's, honestly. I mean, we're all adults, right? You should be if you're driving a car, and if you're an adult, you should be able to make the determination of, hey, this is a bad idea. I should also add that Tesla is not the only company that has had autonomous or semi-autonomous vehicles involved in fatal accidents. There was a case in Tempe, Arizona, involving a Volvo that had been converted into a semi-autonomous vehicle operated by Uber. That car hit a pedestrian while in autonomous mode, and the pedestrian died as a result of the accident. So Tesla is not the only company that has had tragedy befall it due to failures in autonomous systems.
Speaker 1: Getting back to the scale argument for a second: when we're talking about autonomous systems allegedly at fault for accidents that lead to fewer than a dozen deaths, you could say, well, it's all tragic, you never want to see anyone die, and one death is really too many, but still, fewer than twelve is so much smaller than, you know, forty thousand. You might be tempted to say these are tragic accidents, but if you look at how many are caused by humans, there's really no comparison. But once again, you have to remember that humans account for way more vehicle miles traveled, by several orders of magnitude. So really, the only way you could compare the two is if you had autonomous systems driving as many miles as humans are driving, and then you'd have to see whether they still stacked up favorably, whether autonomous cars still accounted for significantly fewer accidents. And we can't say that, because autonomous cars are driving far fewer miles than humans are.
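To make that normalization concrete, here's a minimal back-of-the-envelope sketch. The human figures are the rough ones cited in this episode (about forty thousand deaths and about three trillion vehicle miles per year in the US); the autonomous fleet's deaths and mileage are purely hypothetical placeholders, just to show why raw counts mislead.

```python
# Sketch: why raw death counts can't be compared across fleets directly.
# Human figures are the rough ones from this episode; the autonomous
# fleet numbers below are hypothetical placeholders.

def deaths_per_100m_miles(deaths: float, miles: float) -> float:
    """Normalize a death count to a rate per 100 million vehicle miles."""
    return deaths / (miles / 100_000_000)

human_rate = deaths_per_100m_miles(40_000, 3_000_000_000_000)
av_rate = deaths_per_100m_miles(12, 100_000_000)  # hypothetical fleet mileage

print(f"Human drivers:   {human_rate:.2f} deaths per 100M miles")  # ~1.33
print(f"Autonomous cars: {av_rate:.2f} deaths per 100M miles")     # 12.00
```

Under those made-up fleet numbers, the autonomous rate actually comes out worse even though the raw count is tiny, which is exactly the point: until the mileage is comparable, raw counts tell you very little.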
Speaker 1: So it is true that most accidents involving autonomous vehicles seem to be the fault of human drivers. It's not like most of the accidents we hear about were caused by the autonomous vehicles themselves; it tends to be that someone else, some other human, caused the accident. But in the case of these fatalities, it does look like it was the autonomous system at fault, and that's truly concerning. And also, when it's a person who's at fault, we understand that people make mistakes, and at least in some cases we can feel some sympathy for a person when the situation was truly out of their control, when it was particularly extreme or unusual. But when it's a machine, we've already surrendered control to it, and that's where it gets particularly scary. We have to trust in the machine, and when the machine betrays that trust by failing, that's a big problem. So what happens when there are no controls at all that a human can access? More on that in just a moment, but first, let's take another quick break. One of the challenges autonomous car companies and engineers have faced is how to balance between computer and manual control of a car. How should control switch from one to the other? When should an automated system take over to avoid an accident, like a collision-prevention system does, and when should a driver be able to override autonomous commands and bring the vehicle under manual control? Doing this is not as straightforward as you might think, and doing it in a way that's safe, with a seamless transition of control, is really hard. But what if there's no question about it at all, because there are no controls to take? See, back in 2014, Google showed off a driverless car prototype that had no steering wheel, no accelerator, and no brake pedal, so there were no controls for a human to take over. The car would only operate autonomously, because there were no other options. The prototype worked with a smartphone app and acted as sort of a ride-hailing or robotaxi service. Users could summon a car using the app and indicate where they wanted to go within a very restricted range of operation. It was geofenced, so you couldn't go beyond a certain border, and that border was pretty limited. That meant the vehicle had a lot of variables reduced, right? It cut back on the types of conditions and routes and situations the car might encounter, and thus made the problem of building an autonomous car slightly less complicated. There are still opportunities for complications, but you've drastically reduced them because you've reduced the variables.
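As a rough illustration of what that kind of geofencing can look like in software, here's a minimal sketch of a dispatcher-side check. The center point and radius are hypothetical; a real service area would be a detailed polygon tied to mapping data, not a simple circle.

```python
import math

# Hypothetical service area: a circular geofence around a campus center.
SERVICE_CENTER = (37.4220, -122.0841)  # made-up (lat, lon)
SERVICE_RADIUS_MILES = 3.0

def haversine_miles(a: tuple, b: tuple) -> float:
    """Great-circle distance between two (lat, lon) points, in miles."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    h = math.sin(dlat / 2) ** 2 + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
    return 3958.8 * 2 * math.asin(math.sqrt(h))  # Earth radius ~3958.8 mi

def accept_request(pickup: tuple, dropoff: tuple) -> bool:
    """Only accept trips that start AND end inside the geofence."""
    return all(
        haversine_miles(SERVICE_CENTER, point) <= SERVICE_RADIUS_MILES
        for point in (pickup, dropoff)
    )

print(accept_request((37.43, -122.09), (37.41, -122.08)))  # True: both inside
print(accept_request((37.43, -122.09), (37.80, -122.27)))  # False: dropoff too far
```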
Speaker 1: Well, the vehicle used an electric motor that was good for about a hundred miles of driving per charge, and it boasted a top speed of twenty-five miles per hour. So this little car would only really be suitable for transportation in restricted situations, such as the campus of a big company like, I don't know, Google. It wasn't intended as a practical vehicle for widespread adoption, but rather as another iterative step toward fully autonomous cars. The robotaxi vision tends to be the most common one across the autonomous car space. That's largely because the technology used to give cars autonomy, you know, the sensors, computers, robotic systems, that kind of stuff, doesn't come cheap, and an autonomous vehicle would cost significantly more than a manually operated, traditional car. So most experts agree that the future of autonomous cars, at least in the near term, will be in fleets operated by companies like Uber or Lyft. They will be ride-hailing vehicles or robotaxis, and they will take passengers to their destinations, and then those cars will move on to pick up their next fare, or they'll return to some sort of HQ for recharging or maintenance or whatever. It's unlikely that we're going to see autonomous vehicles offered up for private ownership right away, for the most part due to the prohibitive expense of this additional technology. Google's experiment pointed out both the advances and the limitations of autonomous car technology. Yes, the car had no controls, which is what you would expect only of a level five autonomous car. But it also had very strict geofencing and operational restrictions, so it couldn't go very fast, it couldn't venture very far, and it wouldn't likely encounter unusual situations.
Speaker 1: So because of that, it wouldn't be level five anyway, because you've limited the scenarios where it would be operating in the first place. It would not be driving into all the different situations that a human driver would encounter. A truly autonomous vehicle would need to be able to handle everything, all sorts of unpredictable situations. The average person isn't likely to encounter a truly unusual experience on any given drive, right? It's not like every time you drive down the road you're going to see every single outlier; that's very unlikely. However, when you have a collective three trillion vehicle miles traveled per year, you're bound to get some pretty extreme situations somewhere in those three trillion miles. So you might have a person who has to drive through a dangerous environment, like mud slides coming across a road, or people evacuating parts of California affected by wildfires. There might be animals in the road. There could be people in the road. Weather effects can be unpredictable, and they can change driving conditions rapidly. There are all sorts of things that humans encounter every year, with varying degrees of success in maneuvering around or through them. And if we actually do see autonomous cars take up more of the car landscape, those autonomous cars are going to encounter those situations too. It's just a matter of the odds, you know.
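A quick bit of arithmetic shows why scale turns rare events into near certainties. The three-trillion-mile figure is the rough one cited above; the one-in-a-billion-miles event rate is a purely hypothetical stand-in.

```python
# Back-of-the-envelope: how often does a "one in a billion miles" event
# happen across ~3 trillion vehicle miles per year? (Rate is hypothetical.)

total_miles = 3_000_000_000_000     # ~3 trillion vehicle miles per year
rate_per_mile = 1 / 1_000_000_000   # one event per billion miles

print(f"Expected occurrences per year: {total_miles * rate_per_mile:,.0f}")
# -> Expected occurrences per year: 3,000
```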
Speaker 1: And there are a lot of unanswered questions about how these cars are going to deal with those situations when they arise, and that includes the famous trolley problem dilemma. In the classic trolley problem, you're presented with a hypothetical situation in which a trolley is out of control. It's moving down the tracks and it cannot stop. If you do nothing, if you do not act, the trolley will continue down the track and hit a group of five people. There's no doubt: it will kill those five people. However, you're next to a lever, and if you pull that lever, you will send the trolley down a side track, so it will miss the five people, but it will definitely hit and kill one person. So if you do nothing, five people die, but if you act, one person dies. Does making the choice to pull the lever amount to murdering that one person? Did you just choose to kill that person? Does doing nothing mean that you've murdered five people, or does it just mean that you allowed five people to die? Is there any meaningful difference between those two things? Well, these are all questions in ethics, but with autonomous cars it gets into less hypothetical territory. You have to actually start to answer these questions. Cars may very well encounter situations in which there is no way to avoid injuring or killing someone. So in those cases, what do the cars do? How do the cars choose which person is to be put at risk? How do they decide what action to take? Do they try to protect the people inside the car at all costs? In other words, we're going to make the decision that protects the people inside the car, and anyone else is fair game. Or do they try to protect people outside the car, who don't have the benefit of the car's other safety features? Maybe you build it into an autonomous car that the people inside are allowed to take on a bit more risk, because your thought is, well, the inside of the car is very safe, so we want to make sure we protect, say, a pedestrian or a bicyclist. We don't want the car to hit them, because they would suffer way more damage than the people inside would. That's a possible choice too. But these are not yet answered questions; they're questions that are being answered right now as people design these vehicles.
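Just to illustrate the shape of that design question, and emphatically not how any actual manufacturer has said it does this, here's a toy sketch of how such a weighting could be encoded as a cost function over candidate maneuvers. Every number and name here is hypothetical.

```python
# Toy sketch only: an "ethical weighting" expressed as a cost function.
# Probabilities and weights are invented for illustration.

CANDIDATE_MANEUVERS = {
    # maneuver: (P(harm to occupants), P(harm to pedestrian))
    "brake_straight": (0.10, 0.60),
    "swerve_left":    (0.40, 0.05),
}

# Weighting pedestrians more heavily reflects the "people outside the car
# lack the car's safety features" argument from above.
OCCUPANT_WEIGHT = 1.0
PEDESTRIAN_WEIGHT = 2.0

def expected_harm(maneuver: str) -> float:
    p_occupant, p_pedestrian = CANDIDATE_MANEUVERS[maneuver]
    return OCCUPANT_WEIGHT * p_occupant + PEDESTRIAN_WEIGHT * p_pedestrian

best = min(CANDIDATE_MANEUVERS, key=expected_harm)
print(best)  # "swerve_left" under these particular weights
```

The uncomfortable part is that the ethics live entirely in those two weight constants: flip them, and the car picks the other maneuver.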
Speaker 1: One benefit that autonomous cars might have is that the organizations overseeing them could, at least in theory, use the collective information from an entire fleet of autonomous cars to improve the performance of each vehicle within that fleet. So if one car were to encounter a really unusual experience, engineers could take the data from that experience and tweak the behavior of all the cars across the fleet. When one individual encounters something new, everyone learns from that experience. It's sort of like the Borg in Star Trek: it's a collective. And that's a big advantage over human beings, right? Because when it comes to humans, the person who experienced something might learn from it, but that knowledge doesn't automatically spread across the population in general. So in that way, autonomous cars can have a big advantage over human drivers, if that capability is used properly. On the flip side, when it comes to something as potentially deadly as a vehicle, it's pretty cavalier to say, well, the cars will learn as they go, we'll apply that knowledge to all the vehicles, and they'll get better the longer they drive. Because if learning also includes accidents that could potentially result in injuries or fatalities, that's a really steep price to pay for knowledge.
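Mechanically, that fleet-wide learning loop might look something like this minimal sketch, with all names and structures hypothetical: one vehicle reports an edge case, engineers fold it into a new model version, and every car in the fleet picks up the update.

```python
from dataclasses import dataclass, field

@dataclass
class Vehicle:
    vehicle_id: str
    model_version: int = 1

@dataclass
class Fleet:
    vehicles: list = field(default_factory=list)
    model_version: int = 1
    edge_cases: list = field(default_factory=list)

    def report_edge_case(self, vehicle: Vehicle, description: str) -> None:
        """One car's unusual experience goes into the shared training pool."""
        self.edge_cases.append((vehicle.vehicle_id, description))

    def retrain_and_push(self) -> None:
        """Stand-in for retraining; every vehicle gets the new behavior."""
        self.model_version += 1
        for v in self.vehicles:
            v.model_version = self.model_version

fleet = Fleet(vehicles=[Vehicle("car-001"), Vehicle("car-002"), Vehicle("car-003")])
fleet.report_edge_case(fleet.vehicles[0], "mudslide debris across two lanes")
fleet.retrain_and_push()
print([v.model_version for v in fleet.vehicles])  # [2, 2, 2] -- everyone "learned"
```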
Speaker 1: Meanwhile, we're seeing more companies develop vehicles that have no manual control systems. Google came out with theirs in 2014, but that's not the only case. In January 2020, GM's autonomous car division, which is called Cruise (originally it was an independent startup, but GM gobbled it up a couple of years ago), unveiled a driverless car called Origin. And the Origin, like Google's prototype, has no steering wheel, no accelerator, no brake pedal. It has seats that all face inward. Imagine two benches with backs, facing each other, so the people sitting in what we would consider the front of the vehicle have their backs to the windshield and look back at the people sitting in the back seats, who are looking forward. Now, it's about the size of a crossover SUV, and that means there's a pretty good amount of space inside the vehicle. So while you are facing the other folks, if you're in the front seat you're facing the folks in the back and they're facing you, there's so much space that you're not likely to accidentally kick each other or anything. It looks pretty roomy. On top of that, the car has a cool little keypad on the doors, and the idea is that a production model of this car would be used like a robotaxi. You would hail a ride on your smartphone, this little robo car would come driving up to you, and you would receive a multi-digit passcode. You would get the code in your app, type the numbers into the keypad, and that would open the doors. That way, some unauthorized person couldn't just jump into your car and go gallivanting off without you. You would be able to unlock the car yourself because you had a one-time-use key code.
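Here's a minimal sketch of how that one-time door-code flow could work. The code length, the single-use policy, and everything else shown here are assumptions for illustration, not Cruise's actual implementation.

```python
import hmac
import secrets

def issue_code() -> str:
    """Generate a random six-digit, single-use door code for one ride."""
    return f"{secrets.randbelow(1_000_000):06d}"

class DoorLock:
    """Hypothetical keypad lock provisioned with one expected code."""

    def __init__(self, expected_code: str):
        self._expected = expected_code
        self._used = False

    def try_unlock(self, entered: str) -> bool:
        # Constant-time comparison, and the code only works once.
        if self._used or not hmac.compare_digest(entered, self._expected):
            return False
        self._used = True
        return True

code = issue_code()           # delivered to the rider's app
lock = DoorLock(code)         # provisioned to this car for this ride
print(lock.try_unlock(code))  # True: doors open
print(lock.try_unlock(code))  # False: the code is single-use
```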
Speaker 1: That's a decent concept for a working robotaxi, but the fact remains that we haven't hit level five autonomy yet. At best, we have limited level four. Most of the vehicles we've seen in testing can perform autonomously, but only under pretty tight restrictions: along specific predefined routes, within very strict geofencing, or at particular times of the year, or even particular times of the day. Again, that helps reduce the variables the car might encounter on any given day, and it gives it the best chance to operate safely, but it really limits how useful the cars are in practical applications. For autonomous cars to work as an alternative to manually controlled vehicles, they need to run in pretty much all the same conditions that regular cars do, without restrictions, and we just aren't there yet. We might not be for several more decades. The Prognos Research Institute actually identified four factors that stand in the way of autonomous vehicles. The first is technological maturity, which is what I was just talking about. The second is infrastructure development: having cities designed in such a way that they can accommodate autonomous cars. The third is the inertia of the fleet: there's a ton of manually controlled vehicles out in the world already. The vast majority of cars out there are manual-control vehicles; they might have some limited autonomy, but for the most part they're controlled by humans. It would take a very long time before autonomous vehicles represent a significant percentage of the overall vehicles on the road, let alone a majority. So it will take many, many years to wean off of human-controlled cars and move to autonomous ones, barring any legislation that outlaws human-controlled vehicles. And then finally, there are legal hurdles to overcome: the regulations that are going to be coming out around driverless cars. We're seeing a lot of money poured into research and development to push the technological limits further and to establish the foundation for truly autonomous vehicles.
Speaker 1: But I wonder if these various companies and their investors are really in it for the long haul, so to speak, because I suspect it's going to take a pretty long time to get to a point where there are really reliable, safe, level five autonomous vehicles in the world, let alone a world in which governments have caught up and defined the legal parameters for the operation of these vehicles. Because, you know, it's one thing to prove the technology works; that doesn't necessarily mean the technology will be legal to operate. Governments tend to move a lot more slowly than technology does. So if investors are willing to play the long game, then I think their investments will ultimately pay off. But it's going to take a long time, which means lots of repeated investments are going to be required to keep these companies going, to keep them innovating and improving their technologies. And meanwhile, there's not going to be an actual market for them to capitalize on, outside a few test programs that don't really count, because there's no way the revenue they're generating is actually eclipsing the cost of operation. It's got to be a money-losing proposition right now in all the different test cases. At scale, with fully legal vehicles that are embraced by the general public, sure, it could work from a financial standpoint. Right now, though, it's all just proof of concept that hasn't seen full fruition. Now, I still believe in autonomous cars. I still believe they will ultimately make the roads safer and reduce the number of deaths and injuries from car accidents. I just think it's going to take a lot longer than I had previously imagined. And that's not necessarily a bad thing.
Speaker 1: This is an important enough issue that we have to make sure we get it right, that we can deploy vehicles in ways that make sense, that are truly safe, that are ethical, that are as close to an ideal implementation as we can manage. And we have to make sure it makes financial sense too, right? We need to make sure it truly represents an affordable way to get around, one that eliminates the need for stuff like garages and parking lots in dense urban centers. Those areas could be reclaimed and used for other stuff, and that stuff might be far more productive than just being a storage place for a car when it's not in use. Personal car ownership could really be on a serious decline in that kind of future, replaced with on-demand car services, and the cars that are in service would be used much more frequently, rather than sitting idle and taking up space for the vast majority of their existence. If you think about your average car, the amount of time you're actually using it versus the amount of time it's just sitting there doing nothing is staggering; a car that's driven an hour a day spends roughly ninety-six percent of its time parked. So if you're able to make more use of the vehicle, then it's a more efficient use of the technology. It's a better investment for all the materials that went into making that vehicle. So you could argue this makes more sense from multiple perspectives, if we're able to make better use of the technology and not just have it sitting someplace taking up room. But a lot of things have to fall into place for that future to come true. I think it's a future that makes sense, but only if we can get the tech just right. Before then, what we're really risking is making bad decisions that just make it harder to get to the right future. So we have to be careful in how we're testing these things.
Speaker 1: We have to minimize risk while maximizing our ability to learn, which is a very tricky thing to do, because ultimately you do have to start deploying autonomous cars into populated areas. Otherwise, all you've done is create something that works really well in the lab but not in the real world, and that would be useless to us, because most of us don't live in a lab. I know I don't, not since two thousand fourteen, but that's another story. Guys, if you have any suggestions for future topics for tech Stuff, reach out to me. You can find me on social media. I'm on Facebook and on Twitter with the handle TechStuffHSW, and I'll talk to you again really soon. Tech Stuff is a production of I Heart Radio's How Stuff Works. For more podcasts from I Heart Radio, visit the I Heart Radio app, Apple Podcasts, or wherever you listen to your favorite shows.