Speaker 1: Get in touch with technology with TechStuff from howstuffworks.com.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with How Stuff Works and iHeartRadio, and I love all things tech. In our last episode, I talked about the two thousand seven Urban Challenge, the autonomous car competition that DARPA held specifically to spur on innovation in driverless car technology. One of the elements I didn't really talk about was how cooperative the experience was. I mentioned that there was sort of this air of cooperation, and I talked about how Dave Hall had created a lidar tool that ended up being used by lots of the teams, but it went well beyond that. Teams were eager to share their approaches and their technologies with each other, including the algorithms they were using for decision-making processes. There was a lot of excitement as these different, very intelligent groups of people got together and began to cross-pollinate their ideas. The real goal wasn't necessarily to beat out everyone else, although of course everyone would have liked to have been on the winning team. The real goal was to overcome the huge engineering challenge of developing the technologies that would be necessary for a car to maneuver through an urban environment safely under its own power, following all applicable traffic laws, and integrating with human-driven vehicles in that space. Some people, like Sebastian Thrun of the Stanford Racing Team, would say that the greatest achievement to come out of the challenges wasn't winning top prize. It was when teams would share their knowledge and their experience with one another, allowing separate lines of research and development to start to converge, and it set the ground for what would come next.
Now, the challenges in two thousand four, two thousand five, and two thousand seven proved that autonomous cars could exist on some level, and a lot of the technology that went into making cars navigate and operate independently would go into other systems and other applications, stuff like various driver-assist technologies such as lane detection or automatic braking. It also encouraged some big companies to continue supporting efforts to make driverless cars a reality. So one thing to acknowledge is, even if we were to say we're never going to be at a time where truly autonomous cars exist, ones that could take us from any starting location to any ending location that's connected by roads to our starting point, even if we say that's never going to be truly, reliably possible, what we can say is that we've already seen numerous technologies spawned by these competitions that are not only going into cars today but are saving lives in the process. So that's already a great outcome of this DARPA challenge, even if you, you know, agree that ultimately this was about making more automated military ground combat vehicles; even in that case, it was intended to help save soldiers' lives, so that's a noble endeavor as well.

But let's talk about some of the companies and organizations that formed in the wake of these grand challenges. You had General Motors and Carnegie Mellon University get together. They launched a research and development program called the Autonomous Driving Collaborative Research Lab. Then you also had Volkswagen establish a similar effort with Stanford University. Now, those academic research programs didn't receive nearly as much public attention as the other really big entity that got involved in the field not long after the challenges, and that would be Google, later known as Alphabet, or Waymo. So Alphabet is the parent company, the holding company under which the smaller companies, and by smaller I just mean hierarchically, would spin off.
So you have Google, which continues to focus on the core business of the overall company, but you also have these other subsidiaries that are subsidiaries of Alphabet, not of Google. Anyway, one of those would be a division dedicated specifically to research and development in the area of autonomous cars. So in two thousand seven, Sebastian Thrun took a sabbatical from being a professor at Stanford to go work at Google for a short while. It turned out to be longer than a short while, but he and a team were hired on essentially to help develop Google's Street View tool. That's the tool integrated into various map systems that Google offers, and it allows you to take a street-level look at different locations. It was done by driving special cars outfitted with special cameras through all these different streets in all these different locations. They had about a year scheduled to try and map out all the roads they could in various major cities, and they ended up doing it in nine months, which is pretty impressive. Anyway, that project has had plenty of news coverage, not just because of its utility, but also because of concerns about privacy and security. Not everyone is super crazy about the idea of using an online tool to virtually coast down streets and take a look at different addresses, not to mention possibly spying on people you recognize in places they should not be, but that's a topic for another time.

In two thousand nine, Google would go a step further and secretly begin testing autonomous vehicles. They had been developing that for a couple of years as well, and Sebastian Thrun had worked on it. They were retrofitting Toyota Prius cars at first, and the goal was to conduct ten one-hundred-mile trips in those cars without interruption. So, uninterrupted one-hundred-mile trips in Toyota Priuses, ten times over.
Over the course of two thousand nine, Google would rack up more autonomously driven miles than had been accumulated over all previous years of experimentation among all autonomous car programs. So in one year they eclipsed everything that had come before. Sebastian Thrun would officially join Google as a Google Fellow and found a secret research and development division within the company called Google X. Now, among the many projects that division would focus on, autonomous cars was just one, and Thrun, having participated in the DARPA challenges, had really deep contacts in that field. He could call upon them to help solve the difficult problems that remained in pursuit of the goal of a truly autonomous car that could interoperate with human society. So Thrun ended up hiring people from the various competing teams of the DARPA challenges to join him at Google in developing further autonomous car technologies. Technically, the company had been doing this since two thousand seven.

Meanwhile, Dave Hall, the guy who souped up lidar to make it an indispensable tool, became a billionaire through his work at Velodyne. And Red Whittaker over at Carnegie Mellon would continue teaching at that university, effectively training the next generation of roboticists. Computer scientists throughout the entire industry were continuing to develop machine learning strategies that would become useful in multiple applications, including teaching a car how to drive itself. There were numerous groups that thought this was the real secret to developing a truly autonomous car: machine learning. Not programming a car on what to do in any given situation, but teaching a car, sort of akin to how you would teach a young driver how to conduct him or herself in a car. By this stage, Google's tests had drawn attention from the press.
The New York Times published a piece in October twenty ten titled "Google Cars Drive Themselves, in Traffic." The article revealed that Google had been conducting tests in plain sight for several months, but at that point had not commented on what those tests were all about, and the company was content with people just assuming each one was another Google Street View car. There were fifteen engineers working on the project, and they had hired a dozen or so drivers whose job it was to monitor the performance of the vehicles and to take over if necessary. The engineers testing the vehicles had three main ways they could take control back from the car. In each test vehicle, engineers had installed a button, a nice candy-like red button on the right-hand side of the driver, so they could easily hit the button and that would switch controls to manual. But you could also do it by tapping on the brakes or by turning the steering wheel; either way, the car would hand control back over to the driver. I'll show a little sketch of what that kind of override logic can look like in a moment.

Google had a little hiccup in two thousand eleven, a public hiccup. That was when one of its driverless vehicles was involved in a low-speed car crash, and at the time everyone assumed this was the first real-world car crash with an autonomous car. However, this particular incident was only a very tiny hiccup, because it soon became public that the autonomous car was actually in manual control mode at the time. The human driver behind the wheel was responsible for the accident, not the driverless car technology. Google's driverless cars would maintain a perfect safety record in autonomous mode until two thousand sixteen, publicly. That's assuming you define a perfect safety record in the sense that the autonomous vehicles were not found to be at fault. There were incidents where autonomous cars were involved in accidents, but in every case until two thousand sixteen, every public case, it was determined that the other driver was at fault. Again, stress on "public."
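As a quick aside, here's that sketch of what a manual-override arbiter like the one those test drivers relied on might look like. To be clear, this is purely illustrative: the names, the torque threshold, and the structure are all my own assumptions for the example, not anything from Google's actual system. It just shows the idea that any one of the three triggers immediately flips the car back to manual.

```python
# Illustrative sketch only: a toy manual-override arbiter.
# All names and thresholds here are invented, not Google's actual code.
from dataclasses import dataclass

@dataclass
class DriverInputs:
    red_button_pressed: bool   # the dedicated "take over" button
    brake_pedal_tapped: bool   # any driver brake input
    steering_torque: float     # torque the driver applies to the wheel, in Nm

STEERING_TORQUE_THRESHOLD = 1.5  # assumed value; a real system would tune this

def control_mode(inputs: DriverInputs, current_mode: str) -> str:
    """Return 'manual' as soon as any of the three triggers fires;
    otherwise keep whatever mode the car is already in."""
    if (inputs.red_button_pressed
            or inputs.brake_pedal_tapped
            or abs(inputs.steering_torque) > STEERING_TORQUE_THRESHOLD):
        return "manual"
    return current_mode

# Example: the driver turns the wheel hard while in autonomous mode.
mode = control_mode(DriverInputs(False, False, 2.3), "autonomous")
print(mode)  # -> manual
```

The design point is that handing control back has to be trivially easy and redundant: any reflex a driver already has, hitting a button, braking, grabbing the wheel, should win immediately over the autonomous system.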
Anyway, I'll get back to what I mean by "public" in just a few minutes. In two thousand twelve, Google began to expand its fleet of driverless cars. It added a Lexus RX450h to the mix, so it wasn't just Toyota Priuses; they had a couple of others as well. The company also began to develop its own sensors and to replace the off-the-shelf kind of stuff it was buying. I mean, off the shelf if you knew which shelves to look at, and still pretty exclusive materials, but now the company was developing stuff in-house, purpose-built for its own cars. Two thousand twelve was also when a few Google employees were allowed to start testing this technology on highways around Google's campus, so it went beyond just the direct team working on the project. Now other Google employees could end up driving an autonomous car and allowing it to operate in autonomous mode in specific regions and under specific sets of circumstances. There were a lot of rules in place to participate in this, so you couldn't just switch it to autonomous and sit back all the time.

And two thousand twelve was when the state of Nevada made history by becoming the first state to license autonomous cars for use on state roads. California would follow suit that same year, but the bill that Governor Jerry Brown had signed would only take effect starting in two thousand fifteen. In January two thousand thirteen, Audi and Toyota both showed off autonomous vehicle concept designs at CES, which showed that lots of entities were still very much interested in driverless cars, not just Google. In two thousand fourteen, Google unveiled a prototype electric car that had a top speed of twenty-five miles per hour. There was no steering wheel, no brake, no accelerator, no controls inside the car to allow a human driver to actually take manual control. The car did have some controls, but those were buttons that would tell it when it could go and when it should stop.
So these vehicles were not intended for commercial use. They were part of Google's R&D to test out the possibility of a vehicle that doesn't even have manual controls. It requires a lot of trust to be put into the system. Now, to be fair, these were very limited in their scope. A top speed of twenty-five miles per hour suggests you would use them on residential streets and that sort of thing. You wouldn't use them on a highway; they wouldn't be able to get up to speed.

In two thousand fourteen, the state of California passed legislation requiring any company operating autonomous vehicles in the state to submit reports on any accident involving a vehicle operating in autonomous mode that resulted in, quote, "damage of property or in bodily injury or death," end quote. After that point, Google would report several accidents, more than three dozen. Most of those appeared to be the fault of human drivers, not the autonomous systems. But according to some Google executives, there were some accidents between two thousand eleven and when this piece of legislation was signed in two thousand fourteen that Google chose to keep quiet. The logic the company used was, well, those accidents happened before that law was passed, so it doesn't really apply to them; we don't need to talk about accidents that have already happened, and just from this point forward, we'll talk about it. One of those accidents involved someone who had participated in two of the DARPA Grand Challenges, someone who was working for Google and who had become something of a thorn in the company's side, and that person was Anthony Levandowski. I'll explain more in a second, but first let's take a quick break to thank our sponsor.

Levandowski, if you listened to the previous episodes, was the guy who competed using a self-balancing motorcycle in the two thousand four and two thousand five Grand Challenge competitions. The name of that motorcycle, by the way, no big surprise, was Ghostrider.
He had joined Google in two thousand seven to work with Sebastian Thrun on the Google Street View project, and Levandowski started a few companies related to autonomous cars. He had developed these technologies for Ghostrider, and then he started some startups focused on specific elements he had created for Ghostrider, including one that used spinning lidar as a sensing technology. He didn't invent that, but he did start up a company that specialized in it. Then he pushed Google to buy the tech his other companies happened to be making, which you might think is a little questionable. Levandowski was in a position to market his own companies' products for a project he was working on with Google. So he was in charge of hardware for autonomous cars over at Google, and then he said, well, for us to outfit these cars, let's buy this tech from these two companies. Oh, I happen to own those two companies, so I'm using money from the company I work for to funnel into companies I own. People began to have questions about that, but it totally worked.

As people at Google learned about what had gone on, they started to get a little concerned about this and about Levandowski. It got worse when it became known that Levandowski was talking to some competitors, some other companies outside of Google, about selling them the same technology. This was complicated because he was operating those businesses outside of Google. He was a business owner, and those businesses technically didn't belong to Google, but at the same time he was working for Google, so it seemed like he might be undercutting Google or helping out its competitors, which was complicated. Larry Page, the head of Google, essentially directed the company to acquire Levandowski's businesses rather than risk him leaving the company to oversee those businesses himself.
Levandowski had indicated that he might leave Google in pursuit of leading these two companies he had founded, although his commitment to that course of action is something people have questioned; perhaps he just said it as a way to encourage Larry Page to shell out millions and millions of dollars to acquire the companies. According to a report in The New Yorker, several people in Project Chauffeur, which was the code name for the driverless vehicle program at Google, felt it was a mistake to get so tightly connected to Levandowski. Several of the team members questioned his commitment to Google, as well as his ethical sensibilities, and this would get pushed further after an alleged incident in two thousand eleven.

So here's how the story goes. There was a Google executive named Isaac Taylor who was working on Project Chauffeur, and he took a leave of absence from the project, a paternity leave, when he became a new father in two thousand eleven. When he got back to Google, he found out that Levandowski had made some unauthorized changes to the software that guided the driverless cars. Up to that point, Google had placed some pretty tight restrictions on the routes these driverless cars would be allowed to take in autonomous mode. This is a type of geofencing, where you restrict the operating parameters for a vehicle; I'll show a little sketch of what that kind of restriction can look like a bit later in this episode. The goal was to gather data through many miles of travel, but to control that process by limiting where the cars could actually drive in autonomous mode. Levandowski apparently felt this was not satisfactory, so he made changes to the code to let the autonomous cars drive on routes that previously had been forbidden. Taylor and Levandowski had a rather spirited argument, by all accounts, over at Google, and at that point Levandowski demanded that Taylor accompany him on a ride in an autonomous car to show that this was a good idea.
So, according to the story, the two of them left in a retrofitted Prius and hit the California roads. Now I'm going to quote the New Yorker piece directly here. This is from an article titled "Did Uber Steal Google's Intellectual Property?", written by Charles Duhigg and published October twenty-second, two thousand eighteen, so you can find it; it's a very recent article. Here's Duhigg's description of the incident: "The car went onto a freeway, where it traveled past an on-ramp. According to people with knowledge of events that day, the Prius accidentally boxed in another vehicle, a Camry. A human driver could easily have handled the situation by slowing down and letting the Camry merge into traffic, but Google's software wasn't prepared for this scenario. The cars continued speeding down the freeway side by side. The Camry's driver jerked his car onto the right shoulder. Then, apparently trying to avoid a guardrail, he veered to the left; the Camry pinwheeled across the freeway and into the median. Levandowski, who was acting as the safety driver, swerved hard to avoid colliding with the Camry, causing Taylor to injure his spine so severely that he eventually required multiple surgeries. The Prius regained control and turned a corner on the freeway, leaving the Camry behind. Levandowski and Taylor didn't know how badly damaged the Camry was. They didn't go back to check on the other driver or to see if anyone else had been hurt. Neither they nor other Google executives made inquiries with the authorities. The police were not informed that a self-driving algorithm had contributed to the accident."

That's not a great story. I mean, it's written very well, no offense to The New Yorker or anything like that, but it's not a great thing to have in your historical record, no matter who you are, and it's particularly bad for a company that used to have the motto "don't be evil." Now, Levandowski would actually double down on the results of this incident.
Rather than saying, whoops, my bad, I did something terrible that caused damage and possibly injury, certainly injury to his coworker, he ended up saying that the incident actually provided proof that there was work needed on the algorithms and that Google could learn from the mistake. This was kind of that Silicon Valley philosophy that failure is good because you learn from failure. Okay, well. The article from The New Yorker also states that there was at least one accident that happened while a driverless car was in autonomous mode that did not get reported to the police or to the press. According to the article, a car nicknamed KITT, after good old Knight Rider, was rear-ended at an intersection when the autonomous car braked suddenly after being unable to differentiate a yellow light from a red light. So it stopped short, or stopped too suddenly, and the car behind it was unable to stop and ran into it.

As for Levandowski, he would stay at Google until two thousand sixteen, when he'd leave in order to go found a new company called Otto, O-T-T-O. Otto would consult with Uber on its driverless car program. Google executives would allege that Levandowski took proprietary information and trade secrets with him in this move. They had proof that he had downloaded and transferred an enormous number of files, though there were questions about how valuable those files actually were, and Uber would ultimately fire Levandowski as the case developed. The actual trial happened in early two thousand eighteen, and it was marked by a lot of messy, complicated legal maneuvers on all sides. One day I may have to do a full episode just on that lawsuit and what came out of it, but we can skip to the end: Google's Waymo, because that's what Project Chauffeur evolved into, a subsidiary company called Waymo, would ultimately settle out of court with Uber.
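One more aside before we get back to the cars themselves. Earlier I mentioned geofencing, where Google restricted which routes the cars could take in autonomous mode. Here's the little sketch I promised of what that kind of restriction can look like. Again, purely illustrative: the segment names and the whitelist idea are my own invented example under assumed conventions, not Google's actual implementation.

```python
# Illustrative sketch only: a toy route whitelist acting as a geofence.
# The operating area is a set of pre-approved road segments; the planner
# refuses any route that uses a segment outside that whitelist.
# Segment names are invented for the example.

APPROVED_SEGMENTS = {"shoreline_blvd_01", "charleston_rd_02", "campus_loop_03"}

def route_permitted(route: list) -> bool:
    """A route is drivable in autonomous mode only if every road segment
    on it has been individually approved (i.e., mapped and tested)."""
    return all(segment in APPROVED_SEGMENTS for segment in route)

print(route_permitted(["campus_loop_03", "shoreline_blvd_01"]))  # True
print(route_permitted(["campus_loop_03", "el_camino_real_07"]))  # False: outside the fence
```

The point of a scheme like this is exactly what Levandowski chafed against: the car only drives itself on roads someone has explicitly mapped, tested, and approved, and everything else is off limits by default.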
All right, back to driverless cars. In the same year that Levandowski would leave Google, we get to that publicly acknowledged accident that was the fault of Google's autonomous technology. This was the first time the public heard about a traffic accident that was, quote unquote, caused by Google's driverless car tech. It happened at another intersection, no big surprise there. Here's the description from the incident report filed with the California Department of Motor Vehicles, and it goes like this: "A Google Lexus-model autonomous vehicle ('Google AV') was traveling in autonomous mode eastbound on El Camino Real in Mountain View in the far right-hand lane, approaching the Castro Street intersection. As the Google AV approached the intersection, it signaled its intent to make a right turn on red onto Castro Street. The Google AV then moved to the right-hand side of the lane to pass traffic in the same lane that was stopped at the intersection and proceeding straight. However, the Google AV had to come to a stop and go around sandbags positioned around a storm drain that were blocking its path. When the light turned green, traffic in the lane continued past the Google AV. After a few cars had passed, the Google AV began to proceed back into the center of the lane to pass the sandbags. A public transit bus was approaching from behind the Google AV. The Google AV test driver saw the bus approaching in the left side mirror but believed the bus would stop or slow to allow the Google AV to continue. Approximately three seconds later, as the Google AV was re-entering the center of the lane, it made contact with the side of the bus. The Google AV was operating in autonomous mode and traveling at less than two miles per hour, and the bus was traveling at about fifteen miles per hour at the time of contact."
"The Google AV sustained body damage to the left front fender, the left front wheel, and one of its driver's-side sensors. There were no injuries reported at the scene." So, in other words, this was a pretty minor crash, all things considered, and the report even managed to make it sound like, while Google would not dispute that this was the fault of the driverless car, there was still some shade to cast at the bus driver, who was quote unquote expected to stop or slow down. So publicly, it sounded like driverless cars were doing really well around two thousand sixteen, though in self-limited tests. We weren't seeing autonomous cars sent into unfamiliar territory at this point. And assuming the New Yorker article is accurate, and I have no reason to assume otherwise, there were at least several incidents that could have changed public perception about how safe those cars really were. They were just being kept on the hush-hush. I've got more to say about Google and other companies and driverless cars in just a second, but first let's take another quick break to thank our sponsor.

We'll come back to Google in a moment, but first let's chat a little bit about some of the other companies pursuing driverless car technology. One of them I've already mentioned a couple of times, and that would be Uber. Now, at the two thousand fourteen Code Conference, run by Re/code, Travis Kalanick, who was the CEO and co-founder of Uber, talked about autonomous cars in a pretty ominous way, I would say. He said, quote, "The reason Uber could be expensive is you're paying for the other dude in the car. When there is no other dude in the car, the cost of taking an Uber anywhere is cheaper, even on a road trip," end quote. And by "dude in the car," Kalanick was talking about the driver.
So here's a guy who is the head of a ride-hailing service, a company that employs thousands of drivers, saying, yeah, but man, can't you just imagine what the world would be like if we didn't have to pay those drivers? The message was that trips would cost less for customers. It would also mean that Uber would be able to keep more of the money. It wouldn't have to share any cash with any employees, at least not any drivers, and it would put the company's own drivers out of work if such a future were to come true. It was more than a little harsh, I would say. I would call it demoralizing: if I were driving for Uber and I heard the company's leadership saying, I can't wait to replace you with a robot, it would make me feel not so great about my job.

Kalanick also put forth an idea that frequently comes along with the concept of autonomous cars, which would be the end of private car ownership. Rather than purchasing a car, at least in a densely populated environment, you would just use a driverless vehicle to take you places. The cost per trip could potentially be low enough that you would actually be saving money compared to purchasing your own car, plus paying insurance and all that kind of stuff, not to mention maintenance and parking and that kind of thing. All of those payments would be gone; you would just be paying on a per-trip basis. And in fact, a lot of the future scenarios involving driverless cars that various people project assume that we're mostly going to be interacting with fleets owned by ride-hailing services like Uber and Lyft. We won't own these driverless cars in these visions of the future, because why would we? We would just use a ride-hailing service. Again, this only makes sense if you're living in a fairly densely populated area, a city or a suburb.
It makes less sense the further out from a city you live, but you could see it working at least in some scenarios. Uber's pursuit has made plenty of headlines over the years, including some very grim ones. In March two thousand eighteen, a woman crossing Mill Avenue in Tempe, Arizona, was struck by a self-driving Uber vehicle. It was actually in autonomous mode, and the woman died from her injuries. The car was a Volvo XC90 SUV. There were no passengers in the vehicle apart from the human operator, who was supposed to act as an emergency backup, but again, the car was in self-driving mode at the time of the accident. Waymo would go on to say that the driver was distracted and that Waymo's own system would have been able to respond appropriately in time, which may be true, but I'm not sure it's the classiest thing to say in the wake of someone's death.

Anyway, Uber would suspend all autonomous testing in Arizona, as well as in Pittsburgh, San Francisco, and Toronto, in response. And the state of Arizona essentially came down hard on Uber, and Uber would just completely pull the plug in Arizona, for the time being anyway, as far as autonomous cars go. An investigation revealed that a possible cause of the accident was that the vehicle's emergency braking system had been disabled while in autonomous mode. Again, since that accident, Uber has pulled out of Arizona. They've also laid off some of the employees who were working in the autonomous car division, and they have petitioned to renew testing in other cities. They're not done, you know; even though they had a setback, they're not out of the driverless car business. They're still working toward developing a driverless car fleet. In fact, the company had ordered twenty-four thousand self-driving Volvo SUVs, which are scheduled to begin shipping in two thousand nineteen. Toyota also announced it would invest half a billion dollars in Uber.
This was in addition to an already existing partnership between the two companies, and Toyota was going to incorporate Uber's self-driving technology into its Sienna minivans for its own sort of ride-sharing fleet service. Meanwhile, Lyft is also chasing after a self-driving autonomous fleet. They made a partnership with General Motors, and GM has its own initiative for a self-driving fleet; that one's called Cruise, and it was scheduled to launch also in two thousand nineteen. Ford also has plans for self-driving cars in a ride-sharing capacity. Like I said, that seems to be the big model: the idea is that these cars would be prohibitively expensive for most consumers, so it doesn't make a whole lot of business sense to build them for your average private car owner. It makes more sense to build them for a ride-sharing service, where you will generate revenue over the long term, as opposed to trying to sell them for a profit to a single person. Ford, however, has a timeline that's a little more modest; they're looking at starting in two thousand twenty-one.

Then there's Tesla and its Autopilot feature, which, I should add, has been marketed as a driver-assist feature. It has not been marketed as a truly autonomous vehicle option, or at least that's not the official corporate messaging. Tesla can get a little cheeky with the way they message stuff out, but all the official use cases state that, as a driver, you're supposed to keep your hands on the wheel at all times. You're not supposed to give up control. You can allow it to take over, but you aren't supposed to just sit back and watch it happen, which of course has not stopped people from doing just that thing, despite the fact that the company has said don't do that thing. The company initially rolled out the Autopilot feature as a software update, which is actually really cool.
The idea of rolling out a software update, sending it out to cars, and suddenly they have this Autopilot feature, that's really neat. That's something you wouldn't have seen, you know, five years ago with cars. That happened back in two thousand fifteen, so I guess you'd have seen it three years ago, but not five years ago. But this semi-autonomous system has been involved in at least two fatal car crashes over the last few years. The first happened in two thousand sixteen, when a Tesla car in Autopilot mode collided with a truck that was turning across the path of traffic. The Tesla failed to detect that the truck was there and was unable to stop in time. The second fatal accident happened in two thousand eighteen, when a Tesla vehicle in Autopilot collided with a concrete highway lane divider at high speed.

Then we get back to Waymo, that spinoff from Google, which is probably the most famous of all the autonomous vehicle projects. According to the company, its test vehicles have driven more than ten million miles in autonomous mode, though, to Levandowski's point, some consideration should be given to the fact that many of those miles were over very specific routes. So yes, millions of miles traveled, but if they are millions of the same miles, that has limited utility. In December two thousand eighteen, Waymo is launching, or perhaps has already launched, depending on when you hear this, a self-driving service available to people who have opted into the company's early rider program, and the service is in Phoenix, Arizona, which I assume is a delicate matter in the wake of the tragedy that happened in Tempe, Arizona, in spring of two thousand eighteen with that Uber crash. The service is called Waymo One, and like other ride-hailing services, customers will use an app to hail a car. They will designate where they want to be picked up and where they want to be dropped off.
568 00:35:57,280 --> 00:36:00,440 Speaker 1: The cost of the ride will be dependent upon factors 569 00:36:00,520 --> 00:36:02,879 Speaker 1: like what time of day it is and how far 570 00:36:02,960 --> 00:36:06,359 Speaker 1: away the destination is, or how much distance the trip 571 00:36:06,400 --> 00:36:11,839 Speaker 1: will cover. The cars are Chrysler Pacifica minivans, and they 572 00:36:11,880 --> 00:36:15,120 Speaker 1: do have some controls on the inside for passengers to 573 00:36:15,400 --> 00:36:19,279 Speaker 1: use in various situations. There's a button that can initiate 574 00:36:19,640 --> 00:36:22,319 Speaker 1: essentially an emergency pull over, so you can have the 575 00:36:22,320 --> 00:36:25,480 Speaker 1: car pull over at any point during the ride. 576 00:36:25,560 --> 00:36:29,520 Speaker 1: There's also a contact support button that will put a 577 00:36:29,520 --> 00:36:34,440 Speaker 1: passenger in contact with customer support. So it 578 00:36:34,440 --> 00:36:36,120 Speaker 1: sounds like two thousand nineteen is going to be a 579 00:36:36,160 --> 00:36:40,320 Speaker 1: really big and potentially scary experiment to see if autonomous 580 00:36:40,360 --> 00:36:44,160 Speaker 1: cars are really ready to enter into real service, at 581 00:36:44,239 --> 00:36:47,239 Speaker 1: least in limited markets, and it may well be that 582 00:36:47,280 --> 00:36:51,320 Speaker 1: the results we'll see will show these vehicles are safer, more reliable, 583 00:36:51,320 --> 00:36:54,840 Speaker 1: and more efficient than vehicles that are operated by human drivers. 584 00:36:55,280 --> 00:36:58,680 Speaker 1: But if there are more accidents, or evidence of companies 585 00:36:58,680 --> 00:37:01,960 Speaker 1: trying to cover up accidents, that's gonna be a 586 00:37:02,000 --> 00:37:06,480 Speaker 1: big problem. It also shows how there's a huge disconnect 587 00:37:06,760 --> 00:37:11,520 Speaker 1: between that Silicon Valley philosophy of fail big, fail fast, 588 00:37:11,600 --> 00:37:15,640 Speaker 1: fail often and everyone else's reality. Making risky decisions in Silicon Valley 589 00:37:15,800 --> 00:37:18,200 Speaker 1: is considered to be a really good trait, not a 590 00:37:18,239 --> 00:37:21,279 Speaker 1: bad one. Being risk averse is a bad trait in 591 00:37:21,320 --> 00:37:24,799 Speaker 1: Silicon Valley. It's better to keep throwing yourself out there 592 00:37:25,120 --> 00:37:28,640 Speaker 1: as hard as you can without any fear of failure, 593 00:37:28,960 --> 00:37:32,600 Speaker 1: because if you do succeed, you get to reap incredible benefits. 594 00:37:32,760 --> 00:37:35,200 Speaker 1: And if you fail, well, you just learn something 595 00:37:35,360 --> 00:37:37,319 Speaker 1: in that process and then you just do better the 596 00:37:37,360 --> 00:37:42,160 Speaker 1: next time. You know, people like Levandowski appear to embrace 597 00:37:42,239 --> 00:37:44,799 Speaker 1: that philosophy. But on the other hand, when you get 598 00:37:44,840 --> 00:37:48,239 Speaker 1: to your average schmoes like me, then you have the 599 00:37:48,280 --> 00:37:51,000 Speaker 1: reality of the situation sink in. Because these aren't just 600 00:37:51,160 --> 00:37:54,959 Speaker 1: lines of code in software. These are technologies that could 601 00:37:55,239 --> 00:37:59,839 Speaker 1: dramatically change someone's life in really horrible ways if something 602 00:38:00,080 --> 00:38:03,040 Speaker 1: goes wrong.
See, if I use an app, let's say 603 00:38:03,040 --> 00:38:05,520 Speaker 1: I'm using, I don't know, a pizza delivery app, and 604 00:38:05,600 --> 00:38:08,760 Speaker 1: let's say something screws up and my pizza is lost 605 00:38:09,000 --> 00:38:12,560 Speaker 1: in the ether, like that order never really goes through, 606 00:38:13,000 --> 00:38:15,480 Speaker 1: and I'm sitting around waiting for pizza and I'm hungry. Well, 607 00:38:15,480 --> 00:38:19,120 Speaker 1: that sucks, but I'm gonna live. I can order another pizza. 608 00:38:19,640 --> 00:38:23,120 Speaker 1: But if an autonomous car messes up, someone could die. 609 00:38:23,880 --> 00:38:28,360 Speaker 1: So failing big, failing often, in the sense 610 00:38:28,640 --> 00:38:32,319 Speaker 1: of autonomous cars in the real world, is, I 611 00:38:32,320 --> 00:38:36,960 Speaker 1: would argue, an irresponsible approach. And yes, we learn more 612 00:38:37,000 --> 00:38:41,840 Speaker 1: through our failures than we do through our successes, but again, 613 00:38:41,880 --> 00:38:44,000 Speaker 1: when it comes to human lives, you can't be too 614 00:38:44,040 --> 00:38:49,080 Speaker 1: cavalier about that. Now, this naturally leads into what will 615 00:38:49,120 --> 00:38:51,960 Speaker 1: be the topic of the next episode of tech Stuff, 616 00:38:52,000 --> 00:38:56,960 Speaker 1: which is a broader, more philosophical discussion about autonomous cars 617 00:38:57,000 --> 00:39:00,719 Speaker 1: and ethics, not to mention realistic versus unrealistic views 618 00:39:00,800 --> 00:39:03,200 Speaker 1: of where we are with the technology and where we 619 00:39:03,239 --> 00:39:05,759 Speaker 1: should be or where we need to be in order 620 00:39:05,800 --> 00:39:09,399 Speaker 1: to have a broad rollout, so to speak, of the tech. 621 00:39:09,719 --> 00:39:12,000 Speaker 1: So in the next episode, I'm going to spend more 622 00:39:12,040 --> 00:39:16,080 Speaker 1: time talking about those sorts of big picture ideas, everything 623 00:39:16,120 --> 00:39:20,080 Speaker 1: from the different levels of autonomy to where we are 624 00:39:20,680 --> 00:39:25,920 Speaker 1: to various problems with some of these approaches, even Waymo, 625 00:39:26,400 --> 00:39:33,920 Speaker 1: which, I would argue, despite the alleged incidents that 626 00:39:33,960 --> 00:39:37,080 Speaker 1: went underreported between two thousand eleven and two thousand fourteen, 627 00:39:38,080 --> 00:39:42,239 Speaker 1: appears to be the most responsible of the 628 00:39:42,560 --> 00:39:46,840 Speaker 1: large projects I've read about. Again, big picture view: 629 00:39:47,560 --> 00:39:52,439 Speaker 1: even Waymo, their approach has certain drawbacks that we'll talk 630 00:39:52,480 --> 00:39:56,799 Speaker 1: about in the next episode. Things that mean that 631 00:39:57,280 --> 00:40:01,000 Speaker 1: it may work in most situations, but when you get into 632 00:40:01,080 --> 00:40:06,160 Speaker 1: unusual situations, which happen all the time, it doesn't 633 00:40:06,200 --> 00:40:09,120 Speaker 1: work nearly as well. But that's what we'll talk about 634 00:40:09,160 --> 00:40:12,400 Speaker 1: in our next episode.
If you have suggestions for future episodes 635 00:40:12,480 --> 00:40:16,680 Speaker 1: of tech Stuff, whether it's a huge topic like driverless 636 00:40:16,719 --> 00:40:19,680 Speaker 1: cars that would require multiple episodes, or something where you 637 00:40:19,719 --> 00:40:23,319 Speaker 1: want a really focused episode about a specific technology or a 638 00:40:23,360 --> 00:40:25,640 Speaker 1: person in tech, maybe there's someone you would like me 639 00:40:25,680 --> 00:40:28,200 Speaker 1: to interview, let me know. Send me an email; the 640 00:40:28,200 --> 00:40:32,680 Speaker 1: address is tech Stuff at how stuff works dot com, or 641 00:40:32,840 --> 00:40:36,359 Speaker 1: go to our website, that's tech Stuff Podcast dot com. 642 00:40:36,520 --> 00:40:38,640 Speaker 1: There you can find the different ways to contact me 643 00:40:38,680 --> 00:40:41,399 Speaker 1: on social. I look forward to hearing from you. Don't 644 00:40:41,440 --> 00:40:44,200 Speaker 1: forget to head over to tee public dot com slash 645 00:40:44,239 --> 00:40:47,480 Speaker 1: tech stuff and check out our merchandise. Every purchase you 646 00:40:47,520 --> 00:40:49,799 Speaker 1: make goes to help the show. We greatly appreciate it, 647 00:40:50,160 --> 00:40:58,759 Speaker 1: and I'll talk to you again really soon. For more 648 00:40:58,800 --> 00:41:01,080 Speaker 1: on this and thousands of other topics, visit how 649 00:41:01,120 --> 00:41:12,000 Speaker 1: stuff Works dot com