Speaker 1: Hey guys, this is Jonathan. I have a quick announcement before we get to today's episode, which is sort of a capper on the autonomous car episodes we listened to last week, and that is that starting this week, TechStuff is going back to publishing two new episodes and one classic episode each week, so we're cutting back a little bit. This is so that I can spend more time putting these episodes together, making sure they are the absolute best I can make them, as well as making time to work on some other shows like The Brink. If you're not listening to The Brink, you should go check that out. There are also some more shows that might be coming down the line in the near future, once we get through pilot season here at How Stuff Works. I just want you all to be aware that nothing bad is happening. These are good things. It means that I get to spend more time on this stuff, and I'm not quite as rushed as I have been for the last half year. So enjoy, and I'll see you guys again on Wednesday.

Speaker 1: Get in touch with technology with TechStuff, from HowStuffWorks.com.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with How Stuff Works and iHeartRadio, and I love all things tech. We are continuing our long journey with autonomous cars. Actually, we are concluding it at this point, even though, as I get to the end of the episode, you will hear about how I could cover even more. But out of consideration for you, my dear listeners, I'll move on to other topics for a while. Last week I did a whole series of episodes about the history of autonomous cars. We started with the science fiction visions of the nineteen twenties and radio-controlled vehicles, and went up to the upcoming launch of the Waymo One service in Phoenix, Arizona, which may be in operation by the time you hear this.
Speaker 1: And I mentioned how, over the last several years in particular, a few accidents, a couple of them fatal, have brought a lot more critical attention to driverless cars, even as numerous services prepared to unleash thousands of them on the streets of several cities across the world. In this episode, I'm going to talk about the arguments for and against autonomous cars, or at least against early autonomous car deployment, and about some of the complicating factors. It's not just a technological problem. In fact, I don't really get into it in this episode, but there are entire issues with regulation and legislation. Like many other technologies, autonomous cars have outpaced the law, and so there are a lot of places that are trying to take into account what driverless cars in the streets would actually mean and how you legislate that. Who do you find at fault in the event of an accident? These are still largely open questions.

Speaker 1: There are other questions that I haven't touched on in this episode either. For example, the trolley problem: the idea that if you have to build into your car's system a decision-making process to follow in the event that an injury is unavoidable, you've got a scenario where, no matter what, someone is going to get hurt. How does the car make the choice of who gets hurt? That's a huge question in ethics, in artificial intelligence in general, and in autonomous cars in particular. I don't really go into that in this episode, so there are still many topics in this field that I will have to come back to at some future point. That being said, there's still a ton of stuff to talk about outside of that, and I'm going to start off by examining some of the arguments for autonomous cars, as well as the criticisms leveled at those arguments. Now, before we jump into this: it's going to sound like I'm totally down on the concept, like I'm really against the idea of driverless cars.
Speaker 1: That's what it might sound like to you as we go through this episode, but that is not the case. And I'm gonna be real with you, guys: I don't drive. I do not have a driver's license. I don't talk about this very much because it's hard to talk about, because driving in the United States is such a common experience among many, many people. I mean, unless you're living in a city that has incredible public transportation and driving is incredibly expensive, you probably drive a car. Especially in the Southeast, where I live, owning a car and driving a car is very much tied to the thought of independence. But I don't have a driver's license. I actually have a phobia about driving. I end up locking up when I get behind the wheel of a car and I become a danger on the road, and I'm aware of that, and then that also feeds into my fear and it gets worse and worse. So you see where I'm going. I have realized that I am a danger to myself and others if I get behind the wheel of a car. I am not a good driver.

Speaker 1: So this is an incredibly frustrating condition for me, and it's difficult for me to talk about because there's a big stigma attached to it. But it means that I frequently have to rely upon other people to help me get around, and that means I often feel like a burden, even when people go out of their way to dismiss that idea, saying, no, no, no, no, I want to give you a ride, I want you to be involved in this thing. I can't help but feel otherwise. So I very much want a future in which we have a reliable, safe autonomous car service, where I can call upon a car anytime I need one and it will show up, and I'm not putting anyone out, right? I'm not making it inconvenient for somebody else. It's a service, I'm paying for it, everything's cool. I'm eager to see that kind of thing across the world.
Speaker 1: That being said, I also recognize that I can't allow my desire for that technology to come to maturity to guide all my thinking on the matter or to bias my point of view. The responsible thing to do is to hold the facts up to critical thought and consideration. After all, we're ultimately talking about enormous vehicles traveling at potentially very high speeds, and that's a lot of force. So it's only right that we look at all the facts before we leap to any conclusions. And knowing how badly I want autonomous cars to become a thing has to be a warning sign to me. I have to say, well, I need to be extra careful when I'm looking at the facts, so that I don't just latch on to the most optimistic outlooks and then, you know, deny that there are any problems.

Speaker 1: So the first big argument in favor of autonomous cars tends to be that, assuming you have cars that perform safely and consistently, you would cut way back on the number of fatalities and injuries caused by car accidents. The general line of reasoning is that machines can react in a fraction of the time that people can, which is absolutely true, and that a well-designed vehicle, one that has the proper sensors, could maintain 360-degree awareness at all times, whereas a human can only focus on a small part of their overall field of view, which is locked to whichever way they happen to turn their face at any given time. That's also true, assuming you've designed the sensors and the processing system properly so that the car can accept that data, analyze it in real time, and react to it in real time. Now, this argument that autonomous cars will save lives also assumes that the software, the algorithms guiding the car's behaviors, has been properly designed and tested and is reliable, whether that means programming the car for specific scenarios or using a process like machine learning, in which an autonomous system is, quote unquote, taught how to handle various driving scenarios.
Speaker 1: Now, that assumption is particularly important, and it's also one that we should be particularly critical of. I don't mean to suggest that it is impossible to design a sufficiently adept system or a sufficiently adept set of algorithms, but rather that we should be as certain as we can reasonably expect to be that such systems meet our required levels of reliability before we start getting into driverless vehicles for any sort of given trip. Now, all that being said, statistics do show that humans are far from perfect drivers. The U.S. Department of Transportation produces reports from the Fatality Analysis Reporting System, or FARS, and yes, that's pretty grim, but it's very important work. The two thousand sixteen numbers, which were some of the more recent numbers that I could get hold of, say that in the United States in two thousand sixteen there were thirty-four thousand, four hundred thirty-nine recorded traffic accidents that had one or more fatalities. If you look at all the actual deaths, you're looking at thirty-seven thousand, four hundred sixty-one people who lost their lives in car accidents. When you take into account the population of the United States, that averages out to eleven point six deaths due to car accidents per one hundred thousand people. When you break it out state by state and look at the statistics, you see where the concentration is highest. Mississippi had the highest fatality rate by this metric, at twenty-three point one fatalities per one hundred thousand people. The report also takes into account the number of vehicle miles traveled, which is very important as well, because that tells you how frequently people were on the roads, how far they went, and how much time they were spending on the roads.
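As a quick back-of-envelope check of that per-capita figure, here is the arithmetic in a short sketch. The fatality count is the 2016 FARS number cited above; the U.S. population value of roughly 323 million for 2016 is my own approximation, not something from the episode.

```python
# Rough check of the per-capita fatality rate discussed above.
traffic_deaths_2016 = 37_461        # 2016 FARS fatality count cited in the episode
us_population_2016 = 323_000_000    # approximate 2016 U.S. population (assumption)

deaths_per_100k = traffic_deaths_2016 / us_population_2016 * 100_000
print(f"{deaths_per_100k:.1f} deaths per 100,000 people")  # prints roughly 11.6
```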
Speaker 1: So in two thousand sixteen, the estimated total miles traveled in vehicles in the United States was three trillion, two hundred twenty billion, six hundred seventy-seven million miles. That's a really important number that we're going to come back to in just a second. Another important number: in the United States, experts estimate that human error causes the large majority of all car accidents. The exact percentage varies depending upon which study you're looking at, but the message is that the overwhelming majority of car accidents in the United States are caused by human error, as opposed to, say, mechanical failure or other extenuating circumstances like a tree falling on a car. So it's easy to see how the thought of removing the human element from driving could dramatically reduce the number of crashes.

Speaker 1: But again, this assumes that the driverless systems are in fact more reliable and safer than human drivers. Now, they likely are, but it's hard for us to say we've definitively proven this. In fact, it would be misleading to say we have definitively proven these cars are more reliable and safer under automated control than under human control, and the reason for that is the amount of data. This is where we start to really get scientific with our thinking. The safety argument says: imagine a future in which all cars are autonomous and can detect one another. They can react to changes in the environment much, much faster than humans can. Then imagine how drastic the reduction in fatalities would be. Those nearly forty thousand people who died in two thousand sixteen would still be alive in a world where those cars were the way we got around. But this presupposes that the autonomous cars would perform as we would like them to, and that still remains to be seen. Google's Waymo is the company that I think is the foremost authority on driverless cars, as far as public awareness is concerned, at least.
Speaker 1: They like to point out that their self-driving vehicles have passed the ten-million-mile mark. In other words, their autonomous vehicles have driven more than ten million miles, and there have been no accidents in Waymo vehicles that resulted in a fatality from their operation in autonomous mode. And ten million is a big number. Ten million miles is a lot of distance. But remember that in two thousand sixteen, humans in the United States drove more than three trillion miles. Waymo's figure covers the entire history of their autonomous car program; humans racked up more than three trillion miles in a single year, because of the number of drivers and how much we drive. So we have far more human drivers on the streets, which makes it difficult to draw any sort of comparison between human drivers and autonomous drivers. There are more human drivers out there, and they're driving vastly more miles than autonomous cars are, so it's not an apples-to-apples kind of comparison. We cannot say definitively that autonomous cars are safer. They just haven't been driving enough, and there aren't enough of them out there, for us to make that kind of general conclusion. Does that mean the claim is wrong? Not necessarily. It just means we lack the information to be able to say it definitively, and without that information there's doubt, and when there's doubt, you have problems. In the last couple of years, there have been some pretty high-profile incidents involving autonomous cars and accidents, and those accidents don't necessarily mean that the safety argument for autonomous cars is invalid. We haven't seen a rash of driverless car accidents. We've only seen a handful, maybe a few dozen, and many of those weren't even the fault of the autonomous system, but rather the fault of human drivers who happened to be on the road at the same time.
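To make the mileage comparison above concrete, here is a small sketch of why ten million fatality-free miles can't settle the question on its own. The figures are the 2016 numbers quoted in the episode; the point is simply that the expected number of fatalities over ten million miles is well below one even at the ordinary human rate.

```python
# Why 10 million autonomous miles is too small a sample to prove anything.
human_deaths_2016 = 37_461
human_miles_2016 = 3_220_677_000_000   # ~3.22 trillion vehicle miles traveled
waymo_miles = 10_000_000               # Waymo's publicized milestone

rate_per_mile = human_deaths_2016 / human_miles_2016
per_100_million_miles = rate_per_mile * 100_000_000
expected_fatalities = rate_per_mile * waymo_miles

print(f"Human drivers: ~{per_100_million_miles:.2f} fatalities per 100 million miles")
print(f"Expected fatalities over 10 million miles at that rate: ~{expected_fatalities:.2f}")
# Roughly 0.12 expected fatalities, so zero fatalities over 10 million miles is
# consistent with ordinary human-level safety; it doesn't prove superiority.
```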
Speaker 1: However, even if we were to sift through all of the data and prove definitively, statistically speaking, that driverless cars are safer than cars driven by an average person, it might not matter at the end of the day. That's because we humans are not entirely rational, and I am guessing this does not come as a shocking revelation to you guys. Psychologically, it can be easier for us to accept that driving a car on roads dominated by other human drivers is risky. We understand this because we know that driving, while often a mundane task that we can kind of do without even devoting our full attention to it, can sometimes be punctuated by moments of unique circumstances, and that in those circumstances the wrong reaction could cause an accident. This is something we can wrap our minds around. We can accept this. We know accidents happen; sometimes we are in them, sometimes we cause them, and that is something we can accept. But there's a bit of a leap for an awful lot of people when you shift that to vehicles that are controlled not by human drivers but by machines and algorithms. Suddenly the same people who accept that driving can be inherently dangerous in certain situations find it much more difficult to accept when humans are no longer directly responsible. The journal Risk Analysis published the results of a survey in which people were asked to determine how safe an autonomous vehicle would have to be before they would feel comfortable with the idea of one being on the road. The results were really interesting. They showed that, on average, people felt that autonomous cars would have to be proven to be four to five times safer than human-operated vehicles for them to feel comfortable. Four to five times safer.
Speaker 1: So let that sink in, because by this argument, even if a driverless car were twice as safe as the average human driver, which by definition would mean that if you had enough of them replacing human drivers out on the road you would have fewer injuries, fewer cases of vehicle or property damage, and fewer fatalities, that still wouldn't be enough for these people to feel comfortable, even if they knew that, as a result, fewer people were being hurt or killed and there was less damage. When you turn this argument on its side, it's kind of like saying it's okay if a certain number of people are hurt or killed because we didn't shift to autonomous cars. It's only when autonomous cars are far more safe and capable than human drivers, four to five times more, that people start to feel better about them. And I think you could argue that's a pretty extreme double standard. We're not requiring humans to undergo driving tests more frequently to prove that they are just as safe as, if not safer than, they were the last time they took a test. We're not requiring people to do that, but it is the sort of thing we're demanding from the machines. It's also something I can kind of understand, because it means taking the humans out of the equation, or seeming to, since these cars and algorithms are ultimately designed by humans, and that's scary: the idea of entrusting lives to machines. So it's a bit of a paradox.

Speaker 1: To summarize the safety part of the argument: we don't yet have enough information to really know how safe autonomous cars are compared to human drivers. I suspect they are, as a whole, safer, but again it's hard to tell, because you're extrapolating a lot from a very small sample size. We don't know because we lack that information. There just aren't enough driverless cars on the road, and they haven't driven nearly enough miles in enough different environments, for us to draw any firm conclusions.
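To put that "twice as safe" hypothetical in concrete terms, here is a tiny illustration using the 2016 toll cited earlier. The factor of two is the episode's hypothetical, not a measured figure.

```python
# Illustration of the double standard: a merely 2x-safer fleet would already
# save a lot of lives, yet the surveyed comfort threshold was 4-5x safer.
deaths_2016 = 37_461                   # 2016 FARS figure cited in the episode
deaths_if_twice_as_safe = deaths_2016 / 2
lives_saved_per_year = deaths_2016 - deaths_if_twice_as_safe

print(f"Lives saved per year at 2x safety: ~{lives_saved_per_year:,.0f}")  # ~18,700
```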
Speaker 1: We also have to remember that different companies are taking very different approaches to driverless cars, so we can't make any sweeping statements, because one approach might be better in some situations and worse in others, with no single approach being the best for all scenarios or regions. We do know human drivers are fallible and that we are at fault in the overwhelming majority of car accidents, and we suspect that eventually we'll have autonomous cars that can operate more safely than people. We just don't know if we're there already or if we still have a ways to go. Now, I'll have more to say about the arguments for and against autonomous cars in just a second, but first let's take a quick break to thank our sponsor.

Speaker 1: So, the safety argument also presents a couple of other possibilities besides an improvement in safety, assuming that everything is working properly. It also presents the possibility that autonomous cars might require less material and less energy to operate. Here's the thought: if your vehicles are proven to be safe and you don't have to worry about them getting into accidents, let's say you have reached a point where you're ready to really deploy these, then you don't have to build your vehicles quite as sturdily, right? One of the reasons why cars and trucks and other consumer vehicles weigh as much as they do is that the materials they're made out of are in part there to protect the people inside in case of an accident. But if you've removed the possibility of an accident, then you don't have to build them quite as hardy, we'll say, so you could go to lighter materials to make your vehicles. Well, lighter vehicles need less energy to move around than heavier vehicles do. You don't have to apply as much force to get the vehicle moving or to stop it. So if you reduce the weight of vehicles on the road, you also reduce the amount of energy you need to move them around.
Speaker 1: That also means less fuel consumption, whether that fuel is burned in the vehicle itself, if it's a gasoline-powered or hydrogen-powered vehicle, or in a power plant that produces the electricity that is going to charge an electric vehicle's batteries. And that sounds pretty good, except I would argue we can't really afford to get fully behind switching over to light vehicles until we reach a real tipping point in the percentage of driverless cars versus human-operated ones. Even if we assume robot cars are perfect, which again is just a hypothetical, just assume they work exactly the way they're supposed to, you still have humans on the road, and they could still cause accidents. For that reason, we can't really afford to eschew safety features that add weight to vehicles just yet. We still need to have those considerations in place.

Speaker 1: Another big argument for autonomous cars is that they could have a positive influence on traffic, meaning a reduction in traffic jams and congestion. But the studies on this subject offer conflicting answers. The National Science Foundation funded a study to look into this matter, and the study consulted with auto industry experts and government officials. In that study, the researchers created a simulation in which one autonomous vehicle, a computer-controlled simulated vehicle, circled a track with at least twenty vehicles driven by humans. It was a simulation that used VR, so these were VR cars driven by human beings, and not just twenty simulated human beings, because then you'd be talking about two different types of autonomous cars, and that wouldn't tell you anything anyway. This experiment showed that humans create a stop-and-go oscillation effect in traffic, but that autonomous cars don't cause this kind of oscillation. It can happen when cars change lanes or need to exit the highway or merge onto the highway, and this can lead to traffic jams.
Speaker 1: So if you've ever been on a highway and you hit a super slow section of traffic, and then the traffic starts to clear up, and you never see any reason for the traffic jam, right, there's no accident, there's no reason, nothing to rubberneck at, you might wonder what the heck happened. Well, that's this phenomenon, this oscillating effect. It probably means that someone somewhere did this stop-and-go maneuver and it ended up propagating through the traffic behind that car. The researchers concluded that if even only five percent of the cars on the road were autonomous, it would cut down on traffic problems, and it would reduce fuel consumption, because you wouldn't be doing the stop-and-go, you'd be using fuel more efficiently, and that in turn would mean less pollution emitted as a result. So it has this ripple effect, which sounds great. It's kind of similar to the safety argument, and when you add the fact that it would mean less fuel consumption and less pollution, you start to see real promise here.

Speaker 1: However, there's a flip side to this argument that we cannot ignore. While various studies have argued that concentrations of even five percent or so of autonomous cars on the road would really help with traffic problems, and a full conversion to autonomous cars would presumably be even better, this often presupposes that the number of cars on the road would remain pretty much the same as it is today, but some studies suggest that just isn't necessarily the case. For example, the World Economic Forum and the Boston Consulting Group partnered to conduct a study about how driverless cars might impact traffic in and around Boston. According to their research, they found that driverless cars would actually increase traffic in downtown Boston. Their findings suggested that people would opt to use driverless cars over other alternatives such as public transportation. So maybe some people would give up driving and let the robots do it.
Speaker 1: And in that case, you're just exchanging a human-driven car for a robot-driven car. No big deal; it's a one-for-one swap. But there would be other people who would give up taking the bus or the train in favor of more direct, door-to-door service using these driverless ride-hailing services. So it's possible that in some areas, driverless cars would actually increase congestion in certain municipal corridors, like downtown Boston. That's not an isolated study, by the way; it's not just one outlier. Audi conducted a similar research project in Europe. The company focused on Ingolstadt, Germany, which happens to be where Audi's headquarters are, so I suppose that had something to do with the choice of location. But beyond that, Audi said this city represented a really good case study because it's a medium-sized city and it has very limited public transportation options. And those studies suggest that a more important component than adding driverless cars might simply be increasing the number of people riding in each vehicle. According to Audi's research, if the average car worked out to holding one point three passengers, in other words, if you average it all out and get one point three passengers per car instead of one point one, commute times would drop considerably, even if you increased the number of people taking cars. So true ride sharing would make a bigger difference than autonomous cars. Whether it's a person driving the vehicle or a robot driving the vehicle, the more important part is that you have more people sharing that car. But the company was also quick to point out that what applies to one city may not apply to other cities, so it's not a one-size-fits-all approach. The message seems to be that driverless cars alone are not enough to solve traffic problems, particularly in cities that have lots of cars in them, like Atlanta, Houston, and Los Angeles.
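As a rough illustration of why that occupancy point matters, here is the back-of-envelope arithmetic. The 1.1 and 1.3 passengers-per-car figures are the ones cited above; the number of person-trips is an arbitrary round number chosen for the sake of the example.

```python
# For a fixed number of person-trips, the number of vehicles needed scales
# with 1 / (average passengers per car).
person_trips = 100_000      # arbitrary illustrative figure
occupancy_now = 1.1         # passengers per car, as cited above
occupancy_target = 1.3

cars_now = person_trips / occupancy_now
cars_target = person_trips / occupancy_target
reduction = 1 - cars_target / cars_now

print(f"Roughly {reduction:.0%} fewer vehicles on the road")  # about 15%
```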
Speaker 1: Cities need to invest in public transportation and alternative means of getting around, particularly for short distances, and that will reduce the demand for car services as well as for cars in general. Without that investment, we should expect to see a lot more cars hitting the streets, human-driven or otherwise. I should point out, though, that these studies do pretty much universally conclude that one thing we would need less of with a proliferation of driverless cars is parking spaces. That's because these studies all presume that we're going to rely on ride-hailing services built on these autonomous cars rather than owning our own autonomous vehicles. We're not likely to go out and buy one; they're going to be way too expensive. It makes way more sense to use them on an as-needed basis. And if we're using them just when we need them, then we don't have to park them anywhere. The cars will just drop us off at our destinations and then whisk off on their own to go pick up somebody else while we go do whatever it is we need to do. There'll be no need to park, so a lot of the space currently dedicated to parking in various cities could be repurposed for other things, which is kind of cool.

Speaker 1: One great argument for driverless cars is accessibility. People who cannot drive, for whatever reason, could potentially reclaim a lot of agency in their day-to-day lives: the elderly, people who have vision issues, people like me who have anxiety and fear issues. That's not just a benefit for those people, but for everyone those people would otherwise never meet. It's a benefit to society, because it allows people who otherwise would have trouble integrating into and contributing to society to do so. I mean, it may not be a big positive with me; if I show up, you might roll your eyes and wonder where the closest exit is. But for other people it could be really cool.
Speaker 1: And that's a pretty darned compelling argument. Now, one of the arguments against autonomous cars that I have heard is that if driverless cars are safe, and if they reduce accidents, personal injury attorneys will get less business. While that might be true, I think it's pretty grim to suggest that a reduction in accidents is a bad thing. So while I acknowledge that it would create a hardship for people in that profession, I do not necessarily agree that, in the end, that is the worst of all the alternatives. Then there are the arguments that driverless cars would free up passengers to do other stuff rather than drive. In other words, you don't have to pay attention to driving, so you can do whatever you need to do, like work, or watch movies, or catch up on social media.

Speaker 1: The biggest cautionary warning I think we should consider is that driverless cars may not be effective in all the environments and weather conditions we encounter. Heavy rain, fog, and other weather events could damage or mislead sensors. And since many autonomous vehicles, like Waymo's cars, depend at least in part on pre-mapped information, in other words, the company has already gone out and mapped the regions in which the cars are going to operate, that presents a bit of a challenge. On the good side, the cars have a baseline to work from. It makes them more accurate, more precise, at least as long as the environment continues to match the data inside the database. But things can change. Let's say there's a catastrophic event. Maybe there's a fire, or there's an earthquake. We've had both in the news recently as I record this: in December two thousand eighteen, you had the massive earthquake in Alaska, and you had the fires in California. These can be transformative events, and they can change the terrain significantly from what it used to be.
Speaker 1: And if a car is at least partly dependent upon information that has been stored in a database to know where it is within its environment, that can produce a real problem, especially if there are now obstacles in the way. And if you're talking about an event where you might need to evacuate people, that could cause even more issues. Whether the driverless cars are carrying anyone at the time or not, they might become obstacles themselves. So, conceivably, it could be more dangerous to be in a driverless car in that kind of situation than in a human-operated car. But here's another crazy thing we have to consider: what do we even mean when we talk about driverless cars? I'll explain more in just a bit, but let's take a quick break to thank our sponsor.

Speaker 1: In two thousand fourteen, SAE International, which is an organization that creates standards for the automotive industry, published a document designated J3016, with the title Taxonomy and Definitions for Terms Related to On-Road Motor Vehicle Automated Driving Systems. Other organizations, such as the United States National Highway Traffic Safety Administration, had likewise classified autonomous cars in several ways, but ultimately the world has largely adopted this SAE classification, which has since been updated. Here's how it classifies the concept of autonomous cars: it defines varying levels of autonomy, starting at Level Zero and working up to Level Five, that is, least to most autonomous. A Level Zero automobile can have automatic warnings and may intervene very briefly, but there's no ongoing control of the vehicle; a human driver maintains control through the vast majority of any trip. So you might have something like an emergency braking system that immediately comes on and turns off, but that's it; it's not going to continuously control the car. A Level One autonomous car is one in which the car and the human driver share control.
Speaker 1: A car with adaptive cruise control is an example of a Level One autonomous car. The human driver continues to steer the vehicle while adaptive cruise control is in effect, but the car itself is what's establishing and maintaining the speed, so it's cooperative. A Level Two vehicle can take full control of all driving operations, including steering, accelerating, and braking, but the human driver is supposed to monitor everything closely and be ready to take over control if necessary. Google's driverless cars in the testing phase fell into this category, and, still to this day, Tesla's Autopilot system is essentially a Level Two, or at least it's marketed as such. Level Three autonomous cars are ones sophisticated enough that it is no longer necessary for a human operator to monitor the situation and keep eyes on everything. You can take your eyes off the road, in other words, and turn your focus to something else, at least temporarily. But the car can also give an alert to the person in the car indicating that they need to take over at some given time, preferably not right the heck now, but more like, in five seconds we're going to hand control back over to you. Level Four would allow for total vehicle control, to the extent that a person inside the car could take a nap if they wanted to, but only within specific geographical areas. In other words, there would be a geofencing feature built in, so if the vehicle were to approach the boundary of that area, it would alert the person in the car that their control would be necessary; otherwise, the car would take care of everything. This is the way Waymo One would work, because it operates specifically within the city of Phoenix, Arizona. It's been designed for that; it's been optimized and localized for that. Level Five would be a car that never needed to cede control to a human operator.
A level five autonomous 575 00:36:47,719 --> 00:36:52,160 Speaker 1: car would have to be capable of following all traffic 576 00:36:52,239 --> 00:36:55,120 Speaker 1: laws wherever it may be, and go from city to 577 00:36:55,160 --> 00:36:58,879 Speaker 1: city to city without any issues. So when we say 578 00:36:58,960 --> 00:37:02,960 Speaker 1: driverless cars, we have to determine what level we're talking about. 579 00:37:03,200 --> 00:37:07,319 Speaker 1: It's not sufficient just to say an autonomous car. You 580 00:37:07,360 --> 00:37:11,440 Speaker 1: need to understand that autonomy is a spectrum. So a 581 00:37:11,520 --> 00:37:13,719 Speaker 1: level five car would be ideal. It's the sort 582 00:37:13,719 --> 00:37:16,839 Speaker 1: of model I assume people tend to talk about when 583 00:37:16,840 --> 00:37:20,279 Speaker 1: they're discussing things like the reduction in accidents and the 584 00:37:20,320 --> 00:37:23,840 Speaker 1: improvement of traffic flow, but no one has quite reached 585 00:37:23,840 --> 00:37:27,640 Speaker 1: that level. In fact, that's a very difficult problem to solve. 586 00:37:28,360 --> 00:37:31,680 Speaker 1: Level four is more the region that companies like 587 00:37:31,719 --> 00:37:34,279 Speaker 1: Waymo appear to operate within, or at least how they're 588 00:37:34,320 --> 00:37:37,720 Speaker 1: presenting themselves. A driverless vehicle with no human safety 589 00:37:37,760 --> 00:37:40,160 Speaker 1: operator would have to work at level four at the 590 00:37:40,280 --> 00:37:43,680 Speaker 1: very least, or else you're gambling with a person's life, 591 00:37:43,719 --> 00:37:46,960 Speaker 1: because, again, level three would occasionally hand control back 592 00:37:47,000 --> 00:37:50,000 Speaker 1: over to a human driver. So if you're using a driverless 593 00:37:50,120 --> 00:37:53,200 Speaker 1: ride-hailing service and there's no safety operator in there, 594 00:37:53,760 --> 00:37:55,480 Speaker 1: it has to be a level four if it's going 595 00:37:55,520 --> 00:37:58,440 Speaker 1: to be responsible. If the safe operation of the vehicle 596 00:37:59,080 --> 00:38:01,400 Speaker 1: means that it has to hand over control to 597 00:38:01,440 --> 00:38:04,120 Speaker 1: a person on occasion, that doesn't really work for a 598 00:38:04,200 --> 00:38:08,240 Speaker 1: ride-hailing service, obviously, unless you've also got a backup 599 00:38:08,320 --> 00:38:10,239 Speaker 1: driver in the car, in which case you start to 600 00:38:10,280 --> 00:38:12,680 Speaker 1: ask the question, why are we even bothering with the 601 00:38:12,719 --> 00:38:16,239 Speaker 1: autonomous part, at least from a service perspective? You might 602 00:38:16,360 --> 00:38:20,360 Speaker 1: argue it's useful for gathering information to make it better, 603 00:38:21,239 --> 00:38:23,880 Speaker 1: but when you're talking about an actual ride-hailing service, 604 00:38:24,480 --> 00:38:27,799 Speaker 1: that raises a lot of questions. So level four is 605 00:38:27,800 --> 00:38:30,759 Speaker 1: where we really need to be, at the least. The 606 00:38:30,840 --> 00:38:34,200 Speaker 1: question is, are we really confident that we're at level 607 00:38:34,280 --> 00:38:38,759 Speaker 1: four now?
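To make that taxonomy a little more concrete, here's a rough sketch in Python of how you might represent the SAE levels and that level-four-at-minimum rule for a driverless service. The field names and structure here are purely my own illustration, not anything SAE or Waymo actually publishes.

    from dataclasses import dataclass

    # Rough sketch of the SAE J3016 levels as described above.
    # Field names and structure are illustrative assumptions only.
    @dataclass(frozen=True)
    class AutonomyLevel:
        level: int                # 0 (least autonomous) through 5 (most autonomous)
        summary: str
        human_must_monitor: bool  # does a person have to keep eyes on the road?
        geofenced_only: bool      # is hands-off operation limited to a mapped service area?

    SAE_LEVELS = [
        AutonomyLevel(0, "warnings and brief interventions only; the human drives", True, False),
        AutonomyLevel(1, "shared control, e.g. adaptive cruise handles the speed", True, False),
        AutonomyLevel(2, "car steers, accelerates, and brakes; the human watches closely", True, False),
        AutonomyLevel(3, "eyes off the road, but be ready to take over when alerted", False, False),
        AutonomyLevel(4, "fully driverless, but only within a specific service area", False, True),
        AutonomyLevel(5, "fully driverless anywhere, never needs a human", False, False),
    ]

    def ok_for_driverless_ride_hailing(level: int) -> bool:
        # With no safety operator aboard, the car can never hand control back
        # to a passenger, so the service needs at least level four.
        return level >= 4

That last check is just the point from a moment ago expressed as code: anything below level four assumes a human can take over.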
Now, it is possible. I don't mean to 608 00:38:38,800 --> 00:38:42,319 Speaker 1: suggest that there's no way we're at level four, but 609 00:38:42,400 --> 00:38:45,920 Speaker 1: there are a lot of standards we have 610 00:38:45,960 --> 00:38:49,120 Speaker 1: to meet to truly be level four. There 611 00:38:49,160 --> 00:38:52,320 Speaker 1: are now a lot of very sophisticated technologies that go into 612 00:38:52,360 --> 00:38:59,680 Speaker 1: these cars, and an equally, astonishingly sophisticated series of algorithms 613 00:38:59,719 --> 00:39:04,240 Speaker 1: guiding this technology. And Waymo's approach of rolling 614 00:39:04,239 --> 00:39:08,080 Speaker 1: out in that specific region of Phoenix, Arizona means that 615 00:39:08,120 --> 00:39:10,839 Speaker 1: the cars can rely upon that localized data to help 616 00:39:11,000 --> 00:39:13,560 Speaker 1: guide decisions. The car doesn't have to be so versatile 617 00:39:13,719 --> 00:39:16,720 Speaker 1: that it can adapt to different cities with different traffic laws 618 00:39:16,719 --> 00:39:19,600 Speaker 1: and cultures. And a localized approach means that it's 619 00:39:19,680 --> 00:39:22,560 Speaker 1: possible even to program in rules that might only apply 620 00:39:22,640 --> 00:39:26,800 Speaker 1: to specific parts of Phoenix, like specific intersections. So, for example, 621 00:39:26,880 --> 00:39:30,320 Speaker 1: to let you know what I'm talking about here: in Atlanta, Georgia, 622 00:39:30,320 --> 00:39:33,440 Speaker 1: where I live, there are certain communities where it is 623 00:39:33,440 --> 00:39:36,200 Speaker 1: against the law to make a right-hand turn at 624 00:39:36,200 --> 00:39:41,160 Speaker 1: a red light within that community. There's a community in 625 00:39:41,440 --> 00:39:45,640 Speaker 1: the Atlanta area called Avondale Estates where it is against the law 626 00:39:45,680 --> 00:39:48,800 Speaker 1: to take a right turn on a red light there. 627 00:39:49,440 --> 00:39:52,120 Speaker 1: It doesn't matter what the traffic situation is. If you're 628 00:39:52,120 --> 00:39:55,279 Speaker 1: in Avondale Estates, don't make a right on red. But 629 00:39:55,400 --> 00:39:58,400 Speaker 1: in other parts of Atlanta, making a right on a 630 00:39:58,480 --> 00:40:02,120 Speaker 1: red is perfectly legal. So a driverless car service 631 00:40:02,200 --> 00:40:05,239 Speaker 1: in the Atlanta area would need to have that information 632 00:40:05,280 --> 00:40:08,480 Speaker 1: built in so it could integrate into existing traffic while 633 00:40:08,520 --> 00:40:13,160 Speaker 1: obeying the law. It wouldn't hold up traffic in neighborhoods 634 00:40:13,200 --> 00:40:15,400 Speaker 1: where a right on red was permissible, and it wouldn't 635 00:40:15,400 --> 00:40:17,520 Speaker 1: break the law by taking a right on red in 636 00:40:17,520 --> 00:40:20,360 Speaker 1: a place where it wasn't. That would be a requirement 637 00:40:20,880 --> 00:40:23,680 Speaker 1: that might work if the service is truly local. The 638 00:40:23,719 --> 00:40:26,640 Speaker 1: service might also be bounded by a geo-fencing feature, 639 00:40:26,640 --> 00:40:29,560 Speaker 1: like I mentioned, that prevents riders from specifying a drop-off 640 00:40:29,560 --> 00:40:32,880 Speaker 1: point beyond a certain distance from the center of operations.
So, 641 00:40:32,920 --> 00:40:35,640 Speaker 1: in other words, if I said I'm in Atlanta, I 642 00:40:35,640 --> 00:40:36,920 Speaker 1: want to get picked up, and I want to be 643 00:40:37,000 --> 00:40:42,040 Speaker 1: dropped off in Charleston, in a different state, then if 644 00:40:42,080 --> 00:40:45,600 Speaker 1: I tried to select that destination, 645 00:40:46,120 --> 00:40:48,280 Speaker 1: the app could tell me, "I'm sorry, that's not possible." 646 00:40:48,960 --> 00:40:50,480 Speaker 1: So you might find out you can get a ride 647 00:40:50,480 --> 00:40:53,240 Speaker 1: with a driverless car, but only across town, not across 648 00:40:53,719 --> 00:40:56,759 Speaker 1: the state or into another state. And it's also good 649 00:40:56,800 --> 00:41:02,200 Speaker 1: to remember, just to couch our expectations, what the conditions 650 00:41:02,200 --> 00:41:05,240 Speaker 1: in Phoenix, Arizona are like. The city has an average 651 00:41:05,280 --> 00:41:08,920 Speaker 1: annual rainfall of eight point zero four inches for the 652 00:41:09,160 --> 00:41:14,440 Speaker 1: entire year. In Atlanta, the annual rainfall average 653 00:41:14,520 --> 00:41:18,399 Speaker 1: is closer to fifty inches. Now, what that tells us 654 00:41:18,560 --> 00:41:21,279 Speaker 1: is that you get fewer rainy days in Phoenix than 655 00:41:21,320 --> 00:41:24,320 Speaker 1: you would in Atlanta, and presumably you would have fewer 656 00:41:24,400 --> 00:41:27,759 Speaker 1: days with very heavy rain, which is important because that 657 00:41:27,800 --> 00:41:32,400 Speaker 1: could interfere with a car's sensors. The average annual low temperature 658 00:41:32,400 --> 00:41:35,759 Speaker 1: in Phoenix is sixty three point four degrees Fahrenheit, which 659 00:41:35,800 --> 00:41:39,839 Speaker 1: is about seventeen point four degrees Celsius. Even in the 660 00:41:39,840 --> 00:41:43,200 Speaker 1: coldest month of the year for Phoenix, which is December, 661 00:41:43,600 --> 00:41:47,120 Speaker 1: the average low is forty five degrees Fahrenheit or seven 662 00:41:47,160 --> 00:41:51,600 Speaker 1: point two degrees Celsius, which is chilly for someone like 663 00:41:51,680 --> 00:41:53,799 Speaker 1: me who grew up in the Southeast. I might want 664 00:41:53,840 --> 00:41:56,960 Speaker 1: to wear a jacket, maybe even a light coat. But 665 00:41:57,040 --> 00:42:00,799 Speaker 1: it's not freezing, right? You haven't dropped below the 666 00:42:00,840 --> 00:42:04,760 Speaker 1: freezing point. The likelihood that the driverless cars in Phoenix, 667 00:42:04,760 --> 00:42:09,320 Speaker 1: Arizona would encounter conditions like ice or snow is very low, 668 00:42:09,880 --> 00:42:15,279 Speaker 1: not impossible, but very unlikely. So you could argue that 669 00:42:15,440 --> 00:42:18,000 Speaker 1: rolling out a driverless car service in Phoenix is like 670 00:42:18,320 --> 00:42:21,520 Speaker 1: riding a bike with training wheels. But on the other hand, 671 00:42:21,600 --> 00:42:23,960 Speaker 1: it makes way more sense to me to start off 672 00:42:23,960 --> 00:42:26,960 Speaker 1: in areas where conditions are pretty consistent so that you 673 00:42:27,000 --> 00:42:31,120 Speaker 1: can continue to gather information about driving conditions and driver 674 00:42:31,280 --> 00:42:35,399 Speaker 1: behaviors and work with that knowledge and build it into 675 00:42:35,520 --> 00:42:39,640 Speaker 1: future algorithms and improvements.
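Since I keep leaning on geo-fencing and hyper-local rules, here's a toy sketch, again in Python, of what those two checks might look like inside a ride-hailing app. Every name, coordinate, and rule in it is invented to mirror the Atlanta example; a real service would obviously work from actual map and legal data.

    import math

    # Toy service area: all values are made up for illustration only.
    SERVICE_CENTER = (33.7490, -84.3880)   # roughly downtown Atlanta
    SERVICE_RADIUS_MILES = 25.0            # geo-fence: no drop-offs beyond this

    # Hyper-local rules keyed by community, like the right-on-red example.
    RIGHT_ON_RED_ALLOWED = {
        "avondale_estates": False,  # banned community-wide
        "midtown": True,
        "decatur": True,
    }

    def miles_between(a, b):
        # Crude flat-map distance; good enough for a toy example at city scale.
        lat_miles = (a[0] - b[0]) * 69.0
        lon_miles = (a[1] - b[1]) * 69.0 * math.cos(math.radians(a[0]))
        return math.hypot(lat_miles, lon_miles)

    def can_accept_dropoff(destination):
        # The app simply refuses destinations outside the geo-fenced area,
        # so Atlanta to Charleston never even becomes a trip.
        return miles_between(SERVICE_CENTER, destination) <= SERVICE_RADIUS_MILES

    def may_turn_right_on_red(community):
        # Default to the cautious behavior if a community isn't in the rule table.
        return RIGHT_ON_RED_ALLOWED.get(community, False)

    print(can_accept_dropoff((33.77, -84.37)))        # across town: True
    print(can_accept_dropoff((32.7765, -79.9311)))    # Charleston, SC: False
    print(may_turn_right_on_red("avondale_estates"))  # False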
That might mean we'll see a 676 00:42:39,920 --> 00:42:45,440 Speaker 1: very gradual rollout of autonomous car technology in cities that 677 00:42:45,520 --> 00:42:49,839 Speaker 1: are most amenable to it, let's say, so it might 678 00:42:49,840 --> 00:42:52,319 Speaker 1: be a very controlled rollout, and it might be a 679 00:42:52,320 --> 00:42:55,279 Speaker 1: long time before we see it everywhere. Now, I think 680 00:42:55,280 --> 00:42:59,080 Speaker 1: I'm going to close with one other big point. There 681 00:42:59,120 --> 00:43:02,440 Speaker 1: are tons more that I could talk about, obviously, but 682 00:43:02,480 --> 00:43:05,440 Speaker 1: the one I want to talk about here is another warning. 683 00:43:05,960 --> 00:43:09,080 Speaker 1: I just mentioned the idea of Waymo gathering useful 684 00:43:09,120 --> 00:43:13,520 Speaker 1: information from the driverless car service it's launching in Phoenix, 685 00:43:14,840 --> 00:43:20,080 Speaker 1: but that also ties into information about us as potential customers. 686 00:43:20,840 --> 00:43:24,320 Speaker 1: Waymo is part of Alphabet. Alphabet's the holding company for Google. 687 00:43:24,840 --> 00:43:29,280 Speaker 1: Google is ultimately in the business of information. So consider 688 00:43:29,360 --> 00:43:33,839 Speaker 1: this for a moment. Using a driverless car service is 689 00:43:33,880 --> 00:43:36,800 Speaker 1: just like using any other ride-hailing service. It requires 690 00:43:36,800 --> 00:43:40,440 Speaker 1: that you establish an account with that service. That account 691 00:43:40,480 --> 00:43:43,560 Speaker 1: is tied to your identity. You have to be able 692 00:43:43,600 --> 00:43:46,440 Speaker 1: to pay for the service, so ultimately this has to 693 00:43:46,480 --> 00:43:50,120 Speaker 1: be linked to some legitimate form of payment, which in 694 00:43:50,160 --> 00:43:54,359 Speaker 1: turn tends to be linked to your specific identity. And 695 00:43:54,960 --> 00:43:57,920 Speaker 1: when you use an app to call for a ride, you 696 00:43:58,000 --> 00:44:00,239 Speaker 1: designate where you're going to be picked up and where 697 00:44:00,239 --> 00:44:03,480 Speaker 1: you want to be dropped off. That means part of 698 00:44:03,520 --> 00:44:07,239 Speaker 1: your payment is not in money; it's actually in the 699 00:44:07,280 --> 00:44:11,680 Speaker 1: information you're giving the service. The service learns where you're 700 00:44:11,719 --> 00:44:15,719 Speaker 1: coming from, where you're going, and when you're traveling, 701 00:44:16,360 --> 00:44:20,200 Speaker 1: and that is valuable information. You can bet it will 702 00:44:20,200 --> 00:44:23,920 Speaker 1: get tied to other stuff like advertising. So that's a 703 00:44:24,000 --> 00:44:26,800 Speaker 1: very good thing to remember too. It's not to say 704 00:44:27,040 --> 00:44:30,879 Speaker 1: don't use the service, but be aware of everything that's 705 00:44:30,920 --> 00:44:35,319 Speaker 1: going on; be an informed consumer. Driverless cars might 706 00:44:35,360 --> 00:44:38,480 Speaker 1: transform our world, but they might do so in ways 707 00:44:38,520 --> 00:44:41,839 Speaker 1: that aren't entirely to our liking in the future.
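To put that point a little more concretely, here's a small, purely hypothetical sketch of the kind of record a ride-hailing service ends up holding every time you request a trip. None of these field names come from Waymo or anyone else; the point is just how much the non-money side of the transaction reveals.

    from dataclasses import dataclass
    from datetime import datetime

    # Hypothetical shape of what a single ride request hands over, beyond the fare.
    @dataclass
    class TripRequest:
        account_id: str         # tied to your identity through the payment method
        requested_at: datetime  # when you're traveling
        pickup: str             # where you're coming from
        dropoff: str            # where you're going

    request = TripRequest(
        account_id="user-12345",
        requested_at=datetime.now(),
        pickup="home",
        dropoff="office",
    )

    # String a few weeks of these together and the service can infer where you
    # live, where you work, and your daily routine -- exactly the kind of
    # profile that's valuable for things like targeted advertising.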
Like 708 00:44:41,880 --> 00:44:44,160 Speaker 1: I said, I'll probably do some more episodes about this 709 00:44:44,560 --> 00:44:46,920 Speaker 1: to touch on some of those other elements that I 710 00:44:46,960 --> 00:44:48,600 Speaker 1: mentioned at the top of the show but didn't really 711 00:44:48,600 --> 00:44:51,919 Speaker 1: get a chance to go into. And I didn't really 712 00:44:51,960 --> 00:44:55,279 Speaker 1: talk about things like the impact that driverless cars might 713 00:44:55,360 --> 00:44:59,799 Speaker 1: have on employment, for example, so people who do deliveries, 714 00:45:00,200 --> 00:45:05,160 Speaker 1: you know, postal service, the trucking industry, the taxi industry, 715 00:45:05,360 --> 00:45:08,640 Speaker 1: ride-hailing businesses. I haven't talked about how driverless 716 00:45:08,719 --> 00:45:11,720 Speaker 1: cars could potentially have a big impact on those 717 00:45:11,920 --> 00:45:16,319 Speaker 1: various industries, but I have rambled about autonomous cars for 718 00:45:16,360 --> 00:45:18,920 Speaker 1: more than a week now, so there's plenty for us 719 00:45:19,000 --> 00:45:21,719 Speaker 1: to come back to later. For the time being, we're 720 00:45:21,719 --> 00:45:25,200 Speaker 1: gonna put this to bed. We're gonna switch to other topics. 721 00:45:25,640 --> 00:45:28,120 Speaker 1: I'm gonna take the air out of an entirely different 722 00:45:28,640 --> 00:45:32,000 Speaker 1: futuristic idea in our next episode. If you guys have 723 00:45:32,120 --> 00:45:35,520 Speaker 1: suggestions for topics I should cover in future episodes of 724 00:45:35,520 --> 00:45:38,719 Speaker 1: tech Stuff, whether it's a technology or a company, let me know. Maybe there's 725 00:45:38,760 --> 00:45:40,600 Speaker 1: a person in tech I should talk about. Maybe there's 726 00:45:40,640 --> 00:45:43,160 Speaker 1: someone I should interview or have on as a guest host. 727 00:45:43,800 --> 00:45:46,439 Speaker 1: Let me know. Send me an email; the address is tech 728 00:45:46,480 --> 00:45:49,879 Speaker 1: Stuff at how stuff works dot com. Head on over 729 00:45:49,920 --> 00:45:53,000 Speaker 1: to our website; that's tech Stuff podcast dot com. You'll 730 00:45:53,040 --> 00:45:55,440 Speaker 1: find other ways to contact me, plus a link to 731 00:45:55,480 --> 00:45:58,439 Speaker 1: our store. Remember, every purchase you make goes to help 732 00:45:58,480 --> 00:46:01,160 Speaker 1: the show, and we greatly appreciate it. And I'll talk 733 00:46:01,200 --> 00:46:09,520 Speaker 1: to you again really soon. For more on this 734 00:46:09,680 --> 00:46:12,200 Speaker 1: and thousands of other topics, visit how stuff works 735 00:46:12,239 --> 00:46:22,520 Speaker 1: dot com.