Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland, and how the tech are you? Now, before we get started with today's episode, I wanted to let y'all know about a new way to get in touch with this podcast, and I figured it'd be best to do it up front, because I suspect some of you folks turn off the show when you hear me wrapping up. Shame, shame, shame.

Now, if you download the iHeartRadio app and navigate to TechStuff, which you can do by searching TechStuff and then selecting the entry that's under podcasts, you will see that there's a little microphone icon on that page. And similarly, if you click into any individual episode, you'll see a microphone icon there. If you tap on that icon, you can leave a voice message of up to thirty seconds for the show. The microphone on the main TechStuff page goes into a general folder; that might be for, say, suggesting an episode about a specific topic. If you leave a message on a specific episode, well, it tags that episode. I'll still see it right away, but it means I'll know which episode you're referring to, and maybe you want to leave a comment that's specific to that episode. Maybe it's an addition, it could be a correction, it could be a "cool, I didn't know that, did you also know this?" It could be anything like that. Now, future episodes might include voice clips unless you specifically ask me not to use yours, in which case I will not use it. But if it's something where you're leaving a fun message that could prompt an episode, I would love to be able to start including those in TechStuff. It's a great way to send suggestions for topics, comments on episodes, corrections (I mean, I do sometimes get things wrong), or maybe you just want to say howdy.
Speaker 1: Also, in case you're curious, the only people who have access to these messages are myself and my super producer Tari, so there aren't any other people listening to these in advance. Also, please, if you do use this, be civil, even with the corrections, because I do have feelings. And I look forward to hearing from you.

Podcasting is a fun medium because listeners can often feel like the host is talking directly to them. I know a lot of people who listen with headphones on, and that's a very intimate experience, right? I'm speaking directly into your ears in that case. I have experienced this in the past when I've listened to podcasts. In fact, there are podcasts I listen to where I do feel like I'm just a silent person in a room, which honestly isn't very much like me, but you get what I'm saying. And now this tool lets you talk back to me. And because it's just you on the microphone, I'm not capable of interrupting you, which is a big bonus over having a real-life conversation with me. Anyway, once again, that is the iHeartRadio app with the talkback feature, that little microphone icon you can just tap.

But let's get on with today's episode. I really wanted to cover this topic and give a bit of an update because I feel it's one that is often misunderstood and frequently misrepresented, and that is the different levels of autonomous driving and the state of autonomous driving, where we're at. I occasionally do these updates because I think it's very easy to get lost in marketing and PR speak, and even some, you know, political movements, that suggest we might be further along than where we really are.

Now, first of all, let's get this out of the way: developing autonomous vehicle technology is a good idea. Here in the United States, the National Highway Traffic Safety Administration, or NHTSA, tracks the number of motor vehicle traffic fatalities year over year.
Speaker 1: The number of deaths due to vehicle accidents is always in the thirty thousand range or higher. So that means more than thirty thousand people here in the United States die every year due to traffic accidents, and if you just skim recent figures, that reveals something counterintuitive: the year with the most traffic fatalities over the last decade is twenty twenty. Now, I should add that I could not find an estimate for twenty twenty-one, not for the full year, only for part of the year, so twenty twenty-one very well might have taken the title away from twenty twenty. But yeah, the first year of the pandemic saw the most traffic fatalities in more than a decade. You might imagine that fatalities for that year would actually be down due to more people staying at home, but you'd be wrong. The total that year was thirty-eight thousand, six hundred eighty estimated deaths. The next highest year would be back in twenty sixteen, which had thirty-seven thousand, eight hundred six deaths, almost a thousand fewer.

The NHTSA also considers the fatality rate per one hundred million vehicle miles traveled, or VMT, to be a good metric. This essentially tells you how frequently these accidents occur, how many millions of miles happen in between these fatalities, keeping in mind that's across the entire United States, so it's taking into account all the vehicles traveling across the country. And twenty twenty again had the highest rate over the last decade, with an astonishing one point three seven fatalities per one hundred million vehicle miles traveled. The next highest on that list goes all the way back to two thousand eight, with one point two six.
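That per-hundred-million-VMT rate is just fatalities divided by total miles driven, scaled down. A minimal sketch of the calculation, where the total-miles figure is an illustrative assumption (a rough 2020 ballpark), not a number quoted in the episode:

    # Fatality rate per 100 million vehicle miles traveled (VMT)
    fatalities_2020 = 38_680            # estimated deaths, quoted above
    vehicle_miles_traveled = 2.83e12    # assumed total miles driven that year

    rate = fatalities_2020 / (vehicle_miles_traveled / 1e8)
    print(round(rate, 2))               # ~1.37 fatalities per 100 million VMT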
Speaker 1: So keep in mind, that means more than a decade of innovation and improvements in driver assist features and things of that nature ended up not making a difference, because we saw more fatalities per mile traveled than ever, well, at least since two thousand eight, because that's as far back as the sheet I was looking at went. So people were driving less in twenty twenty, but they were dying in accidents more frequently. And it didn't get better in twenty twenty-one either. During the first nine months of twenty twenty-one, traffic fatalities rose by twelve percent. Within those first nine months there were an estimated thirty-one thousand, seven hundred twenty deaths. Again, just nine months, not a full year. Now, I do not have more recent data to pull from, so I'm not certain how the year ended up, but that statistic is terrible and sobering.

Here's another bad statistic: the CDC says that road traffic crashes are a leading cause of death here in the United States for people aged one to fifty-four. Now, these deaths clearly have a massive impact. Beyond the obvious tragedy of losing someone, or multiple someones, in a crash, there's the impact on that person's family and friends. That's an incredible emotional impact on them. If the someone who died in the crash were employed, well, there's an impact on their job and on the economy as well. There's an overall economic impact due to fatalities and accidents in general. The National Safety Council estimates that the average economic cost of a traffic fatality is one point seven five million dollars per incident. This takes into account everything from medical expenses, to motor vehicle damage, to insurance costs, to wage and productivity losses, and more. So if we take this average figure of one point seven five million dollars and we multiply it by the thirty-eight thousand, six hundred eighty estimated deaths in twenty twenty, we get an economic impact of sixty-seven point seven billion dollars.
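That back-of-the-envelope multiplication, written out as a quick sketch using the figures quoted above:

    # Rough economic impact of 2020 traffic fatalities
    avg_cost_per_fatality = 1.75e6     # National Safety Council average, in dollars
    estimated_deaths_2020 = 38_680     # NHTSA estimate for 2020

    total_impact = avg_cost_per_fatality * estimated_deaths_2020
    print(f"${total_impact / 1e9:.1f} billion")   # -> $67.7 billion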
Speaker 1: That's the cost of those fatal traffic accidents. When you take in all traffic accidents, by the way, the amount climbs far higher. So there are a lot of reasons we would want to reduce or eliminate those traffic fatalities. First and foremost, obviously, we don't want people to die. We would rather not see that happen. We would rather those people go on to lead healthy, productive lives, surrounded by friends and family, and we would not want to see that end. Less importantly, we would not want to see that negative economic impact. And we also would rather not have the thousand other problems that come with traffic accidents, such as the effect on traffic congestion and that sort of thing.

And there's one more important fact to consider. According to research, the vast majority of all traffic accidents are caused primarily by human error. In other words, it's not a mechanical failure. It's not that a vehicle suddenly lost functionality in some way and that led to the accident. Most crashes come down to someone making a mistake, and some research pushes that figure even higher. The highest estimate I have seen suggests that ninety-eight percent of all accidents are caused by human error. So somewhere in that range is probably the truth, and that's still enormous.

That's where the sales pitch for autonomous vehicles comes in. What if you had a computer-controlled vehicle capable of traveling at a safe speed and a safe distance from other vehicles? It would be able to pay equal amounts of attention to all areas around the vehicle. It wouldn't just be paying attention to wherever its eyes were, because it would have eyes all over. It could react much more quickly than any human could, and it could take action that could avoid an accident.
Speaker 1: You would remove human error as a contributing factor for accidents, and since we know human error accounts for the vast majority of accidents, that would, by extension, eliminate most of the accidents on the road. If you had enough of these vehicles on the road, perhaps not replacing all of them, but having a significant percentage of vehicles be autonomous, you would virtually eliminate traffic fatalities. That is the dream scenario. But there are a lot of things standing between us and that vision. I will go into that after we take this quick break.

Speaker 1: Okay. To get to that vision of the future where autonomous vehicles are taking us everywhere we need to go and traffic fatalities are a distant memory, we have to get the tech right first, and we also have to get enough vehicles out on the road to make a difference. One autonomous vehicle isn't going to change the world. And I've seen a lot of different research about this as well, with some researchers saying we might need to reach a saturation point of about one third autonomous vehicles to two thirds human-driven vehicles to start seeing massive changes in things like accidents and traffic congestion. And then, of course, beyond that, we'd see even more improvements, though not necessarily at the same dramatic pace; it might level off more, where we get sort of a diminishing-returns situation. But when you're talking about saving lives, diminishing returns doesn't really have the same meaning, right? Every life saved is significant.

Now, when it comes to autonomy, at least for vehicles, there is a spectrum. Autonomy is more like guidelines, as the pirates would say. The Society of Automotive Engineers, which is a quote "globally active professional association and standards developing organization" end quote, has identified six levels of autonomy, ranging from zero to five, ascending in autonomy. So let's go through those now. At level zero, you have no driving automation, really, of any significance.
Speaker 1: I used to think that level zero just meant you had complete manual control of the vehicle and there were no driver assist features at all. That's not the case. That was my mistake, my misunderstanding of this designation. You could have cruise control and it could still fall into the level zero category. Basic cruise control would not be enough to move a vehicle out of the level zero designation to level one, because basic cruise control will maintain a set speed, but it won't change it. Right? If there's a vehicle ahead of you, standard cruise control isn't enough to detect that vehicle and change your speed. That would have to be adaptive cruise control, and that's different. With basic cruise control, you're still at level zero. Back in the day, I used to think that would at least push you to level one. No, not according to the SAE. The SAE says that any driver assist features in a level zero car are limited to quote "providing warnings and momentary assistance" end quote. So you can have something like a lane correction feature, but if it's just momentarily shifting the car, that is still considered a level zero vehicle. Automatic emergency braking, lane departure warnings, all that kind of stuff just counts as minimal driver assist. So any vehicle having those features, but nothing more extensive than that, would still be level zero. That's a lot of the vehicles that are on the road today, the vast majority of them, in fact.

But let's go on to level one, because we do have level one and even level two autonomous vehicles on the road. A level one autonomous vehicle has the ability to take over some duty that the driver would normally do, but only one of those duties, and the human driver has to handle everything else.
Speaker 1: So, in other words, the feature might be able to handle something like acceleration and braking, or it might be able to handle steering, but it would not be able to handle both of those. Whichever one it handles, the human driver has to deal with the other one. The adaptive cruise control I mentioned earlier, in which the vehicle handles acceleration and will ease off if it detects that it's approaching the rear of a vehicle in front of it, is an example of a level one feature. It's handling acceleration and braking, but it's not handling steering. The human driver still has to steer the vehicle in that scenario.

Now, if we move up to level two autonomy, we're at a phase where the vehicle can handle both the acceleration and braking activities as well as steering the vehicle at the same time. However, a human driver must supervise these and intervene when necessary, as the vehicle will not always be capable of avoiding an accident when something out of the ordinary happens, or even something only remotely unexpected happens. So this would be a vehicle that can do both adaptive cruise control and lane centering at the same time. And honestly, this is where we're at with most of the really sophisticated passenger vehicles on the market right now. Take Tesla, even with its so-called Full Self-Driving feature, which is not full self-driving, and Tesla darn well knows it. I have a longstanding issue with Tesla and its naming conventions. Calling its first driver assist suite Autopilot, when it's not really an autopilot, really upset me. Full Self-Driving is even worse, because the name implies that the car just completely drives itself. That is not the case. It still requires human supervision, and Tesla says this to drivers. It says that you still have to supervise the vehicle, that you can't just take your hands off and go to sleep behind the wheel. You have to maintain supervision and be ready to intervene.
Speaker 1: And yet they insist on calling the feature Full Self-Driving. I just feel like that's misleading. Anyway, that is still a level two autonomous feature. It means Tesla is at level two on the autonomy scale; it hasn't hit level three yet. From levels zero to two, the job of monitoring the driving environment falls on the human in the car. They are responsible for making sure the environment is safe, at least for whatever operation the car is performing at that time, and you cannot offload that to the vehicle. The vehicle is not capable of shouldering that accountability. So ultimately the person behind the wheel is still responsible for their own safety and for avoiding accidents.

But now let's talk about levels three through five, in which the vehicle's automated system is in charge of monitoring the driving environment. In this case, the car, or the vehicle I should say, not just cars, is responsible for monitoring what's going on around it and then using that information to make important decisions. Level three is called conditional automation, and that kind of clues you in to what's going on: the vehicle is capable of operating autonomously under certain conditions, but not all conditions. Unless all of those conditions are met, the vehicle will not operate in autonomous mode. A level three autonomous vehicle will be able to make certain environmental decisions, such as accelerating past a slower-moving vehicle or maneuvering through a traffic jam, which typically involves lots of starts and stops and potentially lane changes. There are a couple of vehicles that have received international acknowledgment as attaining level three autonomy. There's a car called the Honda Legend, which is only available as a lease vehicle in Japan, so unless you're in Japan, you're not likely to see one or experience it. And Mercedes-Benz has received regulatory approval to manufacture level three autonomous vehicles that have its Drive Pilot system installed.
Speaker 1: Drive Pilot has recently been updated and has met international standards to be considered a level three autonomous system. So with a level three vehicle, human intervention isn't needed for most situations. It may still be required in edge cases where things are unusual.

Now, at level four autonomy, an autonomous vehicle would in theory be able to react to most circumstances without the need for human intervention. It would not ask for human intervention; it would just react. The vehicle would respond properly, able to discern what was happening and then take the appropriate action. For example, such a car would presumably be able to distinguish between a trail of goslings attempting to cross the road and some leaves being blown by the wind, and either it would slow down to let the little goslings get across, or it would just continue without worry because it's just some leaves. And yeah, just saying that filled me with anxiety, because I think goslings are cute and the thought of a car just driving through a crossing animal really upsets me on a deep level. However, a level four autonomous vehicle would still be a conditionally autonomous vehicle, so there would still be some conditions under which this vehicle would not operate autonomously. And because this is the level that the SAE considers to be appropriate for robotaxis, it would mean that under certain conditions you wouldn't be able to get a ride in a robotaxi. If conditions are wrong for autonomous operation, they won't pick you up, right? They won't work. Those conditions could include weather events. Let's say it's really pouring down rain, and because of the heavy precipitation there's a fear that the vehicle's sensors won't operate properly, so it's unavailable. Or it could involve things like geofencing, that is, the vehicles can only operate within a certain geographic area.
Speaker 1: Typically we're talking about a specific town or city, or a specific lane between two cities, and the car's systems will not allow the vehicle to venture beyond certain borders. In fact, we're going to talk about how geofencing is one way that companies are looking to tackle very tough engineering challenges in an effort to bring robotaxis to various places. Also, at this level, things like a steering wheel or pedals, as in an accelerator and a brake pedal, might not even be installed in a level four autonomous vehicle, so there would be no controls for you to take over should you feel the need to intervene. You would not have that option. You might have a button to press that acts as an emergency stop feature, where the car would then pull off to the side and come to a halt, but that might be it. So a level four autonomous vehicle should be able to handle pretty much any situation that could pop up under its operational conditions.

And finally, we have level five autonomy. At this level, a vehicle would be able to operate autonomously under any conditions. This is the level where a vehicle could go anywhere and operate safely no matter what conditions might be present. Such a vehicle might have no other controls in it at all and handle absolutely everything by itself. We are nowhere close to achieving level five autonomy right now. So when we come back, we'll talk about some of the things that are holding us back from full autonomy. Some of them are technical and some of them are social and political. But first let's take a quick break.

Speaker 1: Okay, let's talk limitations, and we'll start with the technical. We've seen some pretty dramatic improvements in sensor technology over the years, but the fact is that under some conditions sensors can have trouble detecting the environment effectively, particularly when you're talking about sensors mounted to a platform that's moving at several dozen miles per hour, or kilometers per hour.
Speaker 1: This can include conditions like dense fog or heavy precipitation, like I mentioned earlier. It would be pretty dangerous to ride in an autonomous vehicle that ends up mistaking fog for a solid surface. For example, let's say that with whatever technology the sensors are using, the signals are bouncing back because they're hitting the fog or the precipitation. A car like that might start braking suddenly, as in applying the brakes suddenly, and that could pose a hazard for vehicles behind you. And these limitations aren't hypothetical, either. We've seen some tragic cases in which driver assist systems failed to detect a danger and that led to loss of life.

Famously, back in twenty sixteen, a Tesla driver named Joshua Brown was on a Florida divided highway. That's a highway that has a couple of lanes going in one direction, then typically a divider, a median of some sort, and lanes going in the other direction. Brown had Autopilot engaged in his Tesla, and as he was traveling down the highway, a semi truck pulled out of a driveway that was perpendicular to it. The semi truck was crossing Brown's lanes of traffic in an effort to make a left-hand turn and go the opposite direction down the highway, so it was pulling across the lanes that Brown's vehicle was in. Brown's Tesla failed to detect that there was an obstacle in the way, so it did not slow down, and it ended up colliding with the truck. Brown died in that accident. Now, back then, Tesla was relying on systems provided by a company called Mobileye, but Tesla subsequently ended that partnership and began to develop new sensor technology in house. So the Autopilot in today's Teslas works on a totally different technological platform than the one back in two thousand sixteen.
Speaker 1: Tesla recognized that this was not something it could keep supporting and still expect to be able to sell vehicles with the Autopilot feature in them. That being said, we saw a near-identical tragedy occur in two thousand nineteen, when Jeremy Banner, also going down a divided highway, also in Florida, also driving a Tesla that was in Autopilot mode, also collided with a semi truck that had pulled out to make a left-hand turn. It was in many ways an identical scenario to what we saw before. So that leads to a question: how could two different Autopilot systems make the same fatal error? A lot of this potentially has to do with the types of technologies we depend upon to do certain things.

A lot of the anti-collision technology we find in vehicles relies on Doppler radar. Let's talk about what that means for just a second. First, radar. That involves sending out a radio signal and then detecting the echoes of that signal as they bounce back to the source. So your radar has an emitter and a sensor. The emitter sends out a radio signal; the sensor detects the returning radio signal after it's bounced off something. And when you measure the amount of time it took for a signal to go out and then bounce back to you, that tells you how far away you are from whatever it is that you're beaming signals at, which is pretty simple, right? But let's talk about the Doppler part. If you've got two objects that are standing still, that means the signal you send out and the signal you get back are going to be pretty much the same wavelength and frequency, because both you and the object are at a standstill. But what if one of you is moving toward the other? In that case, the signal that comes back to you is going to be compressed. It's going to be shorter in wavelength and higher in frequency than the signal you sent out.
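That frequency shift is what a Doppler radar actually measures, and it only reflects motion along the line between the radar and the target. A minimal sketch of the relationship, with illustrative numbers that are not from the episode (77 GHz is simply a common automotive radar band):

    import math

    C = 3.0e8  # speed of light, in m/s

    def doppler_shift_hz(transmit_freq_hz, speed_mps, angle_deg):
        """Approximate two-way Doppler shift for a radar target.
        angle_deg is between the target's motion and the line of sight:
        0 means closing head-on, 90 means crossing perpendicular."""
        radial_speed = speed_mps * math.cos(math.radians(angle_deg))
        return 2 * transmit_freq_hz * radial_speed / C

    # A vehicle closing head-on at 30 m/s produces a clear shift...
    print(doppler_shift_hz(77e9, 30, 0))    # ~15,400 Hz
    # ...while one crossing your lane at the same speed produces almost none,
    # which is part of why a perpendicular truck can look stationary to radar.
    print(doppler_shift_hz(77e9, 30, 90))   # ~0 Hz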
Speaker 1: The signal is being compressed by whatever is moving toward you, or, in the case of you moving toward it, by the fact that you're getting closer. So this tells you not only that you are within a certain distance of another object, but that you are getting closer to that object, or that object is getting closer to you, depending on your point of reference. Now, for those of us who are able to hear, this is something we can experience out in the real world without any radar at all. If you've ever heard an approaching emergency vehicle running its siren, you probably noticed that the sound of the siren changes when the vehicle passes you. As the vehicle is approaching you, the pitch of the siren's sound is higher, and as it moves away from you, the pitch goes lower. That's because the actual sound wave is being compressed as the vehicle is moving toward you, and it's expanding as it's moving away from you. The vehicle pushes those sound waves closer together as it's coming at you and stretches them out as it's going away. That's why the pitch is higher one way and lower the other way. The same thing happens with stuff like radar and radio signals, and obviously it gets more complicated if both you and the other object are in motion at the same time.

But these systems work best when everyone is moving in the same direction of travel, right? It works really well if you're behind another vehicle and that vehicle slows down; the system is very good at detecting that. But in a case where a truck is pulling across lanes of traffic, where the truck is moving perpendicular to your direction of travel, it doesn't work as well. It doesn't work as well on stationary objects, and a truck would appear to be nearly stationary as it was moving laterally across your lane of traffic. And depending upon the lighting and the time of day, an autonomous vehicle's camera system might not do much good either.
Speaker 1: It might misidentify a truck as being something like a road sign, and a road sign hangs over the highway, it's not in the way of the highway, so the system wouldn't think of it as a potential threat. Or it might interpret the side of the truck as the sky, depending on those lighting conditions, and that would mean the vehicle's safety features would not initiate, and that could lead to tragedies like the ones we've mentioned in this episode. Now, I don't mean to suggest that these technical problems are impossible to solve. I don't think that's true, but I do think it's going to take a lot of work, and it might require technologies that for the moment are prohibitively expensive, meaning that, yeah, we could create a more reliable autonomous system, but then the cost of the vehicle would be so high that it wouldn't be practical from an economic standpoint. Some of these components, like lidar, can cost as much as a brand new car does all by itself, and I'm not talking about a cheap car either. So when you start looking at that, you have to take the economic factors into consideration and ask, does it make economic sense to develop a system that relies on these components? Will we ever make our money back? Would you be able to sell the vehicle at a cost that would make sense, or would no one buy it because it would be far too expensive? Even if you're talking about, say, a taxi service, maybe a taxi service wouldn't buy it either, because they'd say the likelihood of making their money back with cars that expensive, before they have to replace those cars, is so low that it doesn't make sense to go into it. That could be a real obstacle, and as you can see, it has more to do with the cost than with the actual technology.

I also think a major problem with autonomous vehicles has been in messaging and in hype from companies in the autonomous vehicle business, not just Tesla.
Speaker 1: I heap a lot of abuse on Tesla for this, but it is by far not the only company to engage in these kinds of things. I feel like a lot of the leaders in that space made some really aggressive promises that in hindsight were impossible to follow through on, at least given the time frames that were being tossed around. There were folks suggesting that we would have cities filled with autonomous vehicles able to operate in all conditions by now, and obviously that didn't happen. And no, it wasn't just because a pandemic really messed things up, though that definitely didn't help, because it disrupted everything from workflow to supply chains.

But even with sophisticated sensors, you then have to consider the decision-making factor and take that into account. At the moment, autonomous vehicles are, for the most part, independent computing islands. It's like a personal computer that isn't connected to the internet. All the technology used to keep the car in autonomous operation typically is just on the car itself, so the car is relying on its own sensors and its own processors to detect the environment and make decisions. That is a tremendous amount of responsibility to put on technology. Then again, it's also a tremendous amount of responsibility to put on a teenager, but that's a different discussion. There are some proposals that suggest the real future for autonomous vehicles is one in which vehicles are in constant communication with one another, and potentially with the infrastructure itself, with the road system of something like a smart city that a vehicle could stay in communication with. The road system and the cars on it would both be capable of adapting to different situations. So, as an example, a smart city might have intersections that could proactively change the timing on traffic lights to accommodate changes in traffic patterns, smooth out the experience, and avoid long traffic jams.
Speaker 1: Cars in communication with the smart city infrastructure could plot out the most efficient routes, saving time and energy, and adapt to changes that happen on the fly. They might even be able to favor routes that cause less wear and tear on the actual infrastructure, giving city planners the chance to address things before they become really big problems. I say that as someone who lives in Atlanta and is very used to the sight of giant metal plates laid across the road because potholes have developed and the city has not had time to address them. If you had really smart infrastructure, you could start to detect things before they got to that point of being a problem. So these are really cool ideas, and the improvements in quality of life in those kinds of cities would be enormous.

But to make this happen, a lot of other stuff has to fall into place. First, vehicles and cities would need to agree upon a common set of languages and protocols in order to communicate effectively. That alone is going to require tons of work. Typically, we see car companies develop their own in-house systems, which may or may not be compatible with the systems used by other car companies, and that could lead to real delays in seeing a fleet of truly autonomous vehicles navigate the roads. It would be kind of like putting a bunch of people who don't speak the same language into a maze and having them try to navigate it together. The cars would only understand other vehicles from the same manufacturer, and that would have limited usefulness unless we all just migrated to the same vehicle manufacturer. But that would turn that company into a global monopoly, and I humbly suggest we don't do that. That future in which vehicles talk to each other, as well as to road systems and to cities, will require tons of other work as well. Obviously, cities are going to have to build out smart infrastructure, and that's likely to be expensive and time consuming.
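To make the "common language" point concrete: the idea is that every vehicle and every piece of roadside infrastructure would broadcast and parse the same minimal message format. A toy sketch of what such a shared message might look like; the type name, fields, and units below are invented for illustration and are not any actual V2X standard:

    from dataclasses import dataclass, asdict
    import json

    @dataclass
    class SharedVehicleMessage:
        vehicle_id: str      # stable identifier for the sender
        latitude: float      # position in decimal degrees
        longitude: float
        speed_mps: float     # current speed, meters per second
        heading_deg: float   # compass heading
        braking: bool        # whether the brakes are currently applied

    # Any manufacturer's car, or a smart intersection, could read this.
    msg = SharedVehicleMessage("veh-042", 33.7490, -84.3880, 12.5, 270.0, True)
    print(json.dumps(asdict(msg)))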
We 571 00:36:26,360 --> 00:36:30,160 Speaker 1: also need really good wireless technologies in place too. Five 572 00:36:30,280 --> 00:36:33,200 Speaker 1: G has promise, but we would need it to be 573 00:36:33,320 --> 00:36:37,560 Speaker 1: more dense and more widespread than it is currently. For 574 00:36:37,640 --> 00:36:41,600 Speaker 1: some industries like trucking, five G is not a great 575 00:36:41,640 --> 00:36:46,080 Speaker 1: solution, because on long hauls between cities, you're gonna hit 576 00:36:46,160 --> 00:36:48,239 Speaker 1: routes that are going to have little to no five 577 00:36:48,320 --> 00:36:51,920 Speaker 1: G support along the way, and it would be foolish 578 00:36:52,120 --> 00:36:54,520 Speaker 1: to build out a business that depends upon five G 579 00:36:54,640 --> 00:37:00,279 Speaker 1: connectivity if that business involves navigating through places that don't 580 00:37:00,280 --> 00:37:03,239 Speaker 1: have five G support. Meanwhile, there has to be a 581 00:37:03,239 --> 00:37:07,120 Speaker 1: strong enough business case for the telecommunications industry to build 582 00:37:07,120 --> 00:37:11,239 Speaker 1: out five G availability to these regions. That's hard to 583 00:37:11,280 --> 00:37:14,400 Speaker 1: imagine if there just aren't that many people living along 584 00:37:14,400 --> 00:37:18,000 Speaker 1: those routes, right? If it's not densely populated, then the 585 00:37:18,000 --> 00:37:22,880 Speaker 1: telecommunications companies are gonna say, well, it's really expensive to 586 00:37:23,000 --> 00:37:25,839 Speaker 1: build out the infrastructure. We have to put antennas all 587 00:37:25,840 --> 00:37:29,040 Speaker 1: over the place if we want to have high frequency 588 00:37:29,080 --> 00:37:32,760 Speaker 1: five G, the stuff that's really high throughput, low latency, 589 00:37:32,960 --> 00:37:34,839 Speaker 1: that stuff, the five G that we think of when 590 00:37:34,840 --> 00:37:38,600 Speaker 1: we think, oh, this can replace fiber Internet. You have 591 00:37:38,680 --> 00:37:43,560 Speaker 1: to have very dense antenna deployment for that to work. Well, 592 00:37:43,560 --> 00:37:47,560 Speaker 1: that's expensive, and if a place is not densely populated, 593 00:37:48,080 --> 00:37:50,760 Speaker 1: it's a really tough thing to tell a telecommunications company 594 00:37:50,800 --> 00:37:54,000 Speaker 1: you need to take on the expense to build out 595 00:37:54,000 --> 00:37:58,640 Speaker 1: that infrastructure even if you only have a few thousand 596 00:37:58,680 --> 00:38:02,480 Speaker 1: customers in these areas, right? Like, that's a 597 00:38:02,480 --> 00:38:06,359 Speaker 1: tough sales pitch to make to these telecommunications companies. So 598 00:38:06,400 --> 00:38:10,840 Speaker 1: it's not likely that we're going to see dense five 599 00:38:10,920 --> 00:38:14,680 Speaker 1: G coverage in some of these places. So five G 600 00:38:14,880 --> 00:38:17,520 Speaker 1: could work in cities, and it might be a really 601 00:38:17,560 --> 00:38:20,640 Speaker 1: good solution for the future of smart cities, but that 602 00:38:20,680 --> 00:38:24,000 Speaker 1: would likely mean we would see autonomous vehicles largely confined 603 00:38:24,160 --> 00:38:27,920 Speaker 1: to a specific region, that geofencing approach.
They wouldn't 604 00:38:27,960 --> 00:38:31,799 Speaker 1: necessarily be able to travel outside of those regions, not 605 00:38:31,920 --> 00:38:35,280 Speaker 1: without having a robust enough system on board the vehicle 606 00:38:35,320 --> 00:38:38,520 Speaker 1: itself to handle everything when it, you know, kind of 607 00:38:38,600 --> 00:38:42,040 Speaker 1: loses connectivity. And that's a really big part of it too. 608 00:38:42,080 --> 00:38:45,040 Speaker 1: The fact is, all sorts of weird stuff happens on 609 00:38:45,080 --> 00:38:49,000 Speaker 1: the roads, and engineers are not able to anticipate all 610 00:38:49,040 --> 00:38:51,759 Speaker 1: of it. None of us are. There's just no practical 611 00:38:51,880 --> 00:38:55,640 Speaker 1: or even possible way to preprogram the right response to 612 00:38:55,920 --> 00:38:59,120 Speaker 1: every possible thing that could happen on the roads. This 613 00:38:59,160 --> 00:39:03,200 Speaker 1: is another reason why geofencing is an important solution, 614 00:39:03,480 --> 00:39:05,600 Speaker 1: or at least a temporary one, because it limits the 615 00:39:05,640 --> 00:39:09,680 Speaker 1: operational range of an autonomous vehicle. That also means it 616 00:39:09,719 --> 00:39:13,480 Speaker 1: limits the number and variety of weird stuff that that 617 00:39:13,600 --> 00:39:17,560 Speaker 1: vehicle could potentially encounter. It's not gonna eliminate weird stuff. 618 00:39:17,840 --> 00:39:22,279 Speaker 1: It'll just limit it to mostly a subsection of all the 619 00:39:22,320 --> 00:39:29,480 Speaker 1: weird stuff, right? Eliminating variables makes the whole problem less complicated. 620 00:39:29,520 --> 00:39:32,680 Speaker 1: I was gonna say easier, but that's the wrong implication. 621 00:39:33,120 --> 00:39:36,880 Speaker 1: Less complicated is probably the better wording, because it's still 622 00:39:37,120 --> 00:39:41,080 Speaker 1: wicked complicated. Then we have to get through all these 623 00:39:41,120 --> 00:39:46,560 Speaker 1: social components to making driverless vehicles an acceptable reality. According 624 00:39:46,600 --> 00:39:52,600 Speaker 1: to Alex Kopatinski of Policy Advice, around forty percent of people 625 00:39:52,640 --> 00:39:55,279 Speaker 1: in the United States do not feel safe in a 626 00:39:55,400 --> 00:39:58,560 Speaker 1: driverless car. They do not like the idea of getting 627 00:39:58,560 --> 00:40:03,839 Speaker 1: into a driverless vehicle. I actually think forty percent being uncomfortable 628 00:40:03,960 --> 00:40:06,759 Speaker 1: is pretty low. I would have expected that more than 629 00:40:06,840 --> 00:40:09,600 Speaker 1: half of people in the United States worry about their 630 00:40:09,640 --> 00:40:12,440 Speaker 1: safety if they were to get into a driverless vehicle. 631 00:40:13,360 --> 00:40:16,560 Speaker 1: Now that's not based on any hard data. That's just 632 00:40:16,680 --> 00:40:19,440 Speaker 1: my feeling, which we all know is not really 633 00:40:19,560 --> 00:40:22,960 Speaker 1: evidence; it's not worthwhile. I just feel 634 00:40:23,000 --> 00:40:28,480 Speaker 1: like forty percent feels really generous. And that's because whenever a 635 00:40:28,600 --> 00:40:31,800 Speaker 1: driverless vehicle gets in an accident, we see a disproportionate 636 00:40:31,840 --> 00:40:36,160 Speaker 1: amount of coverage about that accident. Now, we all know 637 00:40:36,760 --> 00:40:41,200 Speaker 1: fatal accidents due to human error happen far too frequently.
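As a rough illustration of what "limiting the operational range" can mean in software, here is a minimal sketch in Python, assuming nothing more than a simple circular geofence around a service area. Real operational design domains cover road types, weather, time of day, and much more, so the function names and the Atlanta-centered example here are hypothetical, not how any particular company actually does it.

```python
import math

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

def inside_geofence(vehicle_lat, vehicle_lon, center_lat, center_lon, radius_km):
    """True if the vehicle is within the approved circular operating area."""
    return haversine_km(vehicle_lat, vehicle_lon, center_lat, center_lon) <= radius_km

# Example: a hypothetical service area centered on downtown Atlanta, 15 km radius.
print(inside_geofence(33.7756, -84.3963, 33.7490, -84.3880, 15.0))    # True: near downtown
print(inside_geofence(34.0522, -118.2437, 33.7490, -84.3880, 15.0))   # False: Los Angeles
```

The idea is that a vehicle approaching the boundary would plan a safe stop or hand control back rather than continue onto roads it was never validated for, which is exactly the trade-off the geofencing approach accepts.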
638 00:40:41,239 --> 00:40:45,880 Speaker 1: They're happening every day, lots of them, with thirty eight thousand, 639 00:40:46,760 --> 00:40:52,160 Speaker 1: nearly thirty nine thousand, happening in a single year. Lots are happening 640 00:40:52,160 --> 00:40:55,120 Speaker 1: every single day, and they typically do not receive nearly 641 00:40:55,239 --> 00:40:59,480 Speaker 1: as much coverage as an autonomous vehicle accident beyond maybe 642 00:40:59,560 --> 00:41:03,000 Speaker 1: local reports. Your local news might cover it, and 643 00:41:03,080 --> 00:41:05,440 Speaker 1: sometimes it might just be something like a traffic report, 644 00:41:06,239 --> 00:41:08,520 Speaker 1: but you don't get the coverage that you would get 645 00:41:08,520 --> 00:41:11,600 Speaker 1: if it were an autonomous vehicle. When an autonomous vehicle 646 00:41:11,640 --> 00:41:16,120 Speaker 1: is at fault, that becomes national or maybe even global news. 647 00:41:16,160 --> 00:41:19,680 Speaker 1: So there is a disparity going on here, right? You 648 00:41:19,719 --> 00:41:24,960 Speaker 1: could argue that autonomous vehicles in aggregate are 649 00:41:25,040 --> 00:41:31,720 Speaker 1: far safer than human-driven vehicles, but the news 650 00:41:31,719 --> 00:41:35,400 Speaker 1: reporting has an amazing disparity there. There's going to be 651 00:41:35,480 --> 00:41:39,000 Speaker 1: way more reporting on autonomous vehicle accidents than on your 652 00:41:39,040 --> 00:41:44,439 Speaker 1: typical human-caused accident. Still, even at forty percent, you're still looking 653 00:41:44,480 --> 00:41:46,959 Speaker 1: at a ton of people who would feel unsafe getting 654 00:41:47,000 --> 00:41:50,120 Speaker 1: into a driverless car, and whether that worry is justified 655 00:41:50,280 --> 00:41:53,640 Speaker 1: or not, it poses a challenge when it comes 656 00:41:53,680 --> 00:41:57,200 Speaker 1: to policy. See, another really big component in the adoption 657 00:41:57,239 --> 00:42:00,960 Speaker 1: of driverless car technology is getting politicians to 658 00:42:01,440 --> 00:42:05,600 Speaker 1: support it and to develop laws that create the guidelines 659 00:42:05,640 --> 00:42:10,320 Speaker 1: around it. Many places lack laws that would cover autonomous 660 00:42:10,400 --> 00:42:13,680 Speaker 1: vehicles. That gap is something to be concerned about. 661 00:42:13,760 --> 00:42:17,600 Speaker 1: It means that if something happens, like if an autonomous 662 00:42:17,680 --> 00:42:20,520 Speaker 1: vehicle gets into an accident in one of those places, 663 00:42:21,000 --> 00:42:24,480 Speaker 1: you would have no framework to establish things like accountability. 664 00:42:24,640 --> 00:42:27,680 Speaker 1: That's just one tiny slice of why it's important to 665 00:42:27,800 --> 00:42:31,920 Speaker 1: establish policy so that there is a legal infrastructure around 666 00:42:31,960 --> 00:42:37,040 Speaker 1: autonomous vehicle operation. That legal structure would in turn inform 667 00:42:37,160 --> 00:42:40,959 Speaker 1: manufacturers as to certain requirements their vehicles would have to meet, 668 00:42:41,440 --> 00:42:45,080 Speaker 1: and that would mean that these policies would also shape technology.
669 00:42:45,160 --> 00:42:47,920 Speaker 1: It wouldn't just be how can we make this technology do 670 00:42:48,040 --> 00:42:50,399 Speaker 1: the thing we want it to do; it would be how 671 00:42:50,440 --> 00:42:52,560 Speaker 1: can we make this technology do the thing we want it 672 00:42:52,600 --> 00:42:57,480 Speaker 1: to do within the legal framework. So that's an important 673 00:42:57,520 --> 00:43:00,000 Speaker 1: thing that has to happen too. And in some places 674 00:43:00,320 --> 00:43:03,239 Speaker 1: we're seeing that. In other places there's not really any 675 00:43:03,239 --> 00:43:07,279 Speaker 1: official policy. If the policies don't match each other, that 676 00:43:07,400 --> 00:43:10,880 Speaker 1: also creates a complication, because what do you do in 677 00:43:10,880 --> 00:43:13,800 Speaker 1: a case where an autonomous vehicle that's legal to operate 678 00:43:13,880 --> 00:43:16,960 Speaker 1: in one place ventures into a place where it's not 679 00:43:17,120 --> 00:43:20,200 Speaker 1: legal to operate because it doesn't fully meet all the requirements? 680 00:43:20,480 --> 00:43:24,319 Speaker 1: These are tough questions. Now, way back in the day, 681 00:43:24,880 --> 00:43:27,000 Speaker 1: if you listened to tech stuff, you know I was 682 00:43:27,160 --> 00:43:30,520 Speaker 1: really super hyped up for autonomous vehicles. I was so 683 00:43:30,640 --> 00:43:34,439 Speaker 1: excited about it, and that's because I recognized the fallibility 684 00:43:34,440 --> 00:43:37,440 Speaker 1: of humans. I recognized that we make mistakes, and the 685 00:43:37,480 --> 00:43:40,840 Speaker 1: fact that technology can detect and react at a speed 686 00:43:41,120 --> 00:43:44,640 Speaker 1: that is impossible for us to even imagine meant the 687 00:43:45,280 --> 00:43:49,680 Speaker 1: potential for this technology to save thousands of lives every year. 688 00:43:50,040 --> 00:43:53,520 Speaker 1: And based off that very limited perspective, yeah, autonomous cars 689 00:43:53,680 --> 00:43:57,440 Speaker 1: are phenomenal, but that also ignores all the other things 690 00:43:57,440 --> 00:44:00,360 Speaker 1: that you have to take into consideration, that driving safely 691 00:44:00,840 --> 00:44:04,040 Speaker 1: is about way more than just seeing something and then 692 00:44:04,120 --> 00:44:07,680 Speaker 1: reacting to it in time. And that's really where I 693 00:44:07,719 --> 00:44:11,120 Speaker 1: was being blind. I bought into the hype and was 694 00:44:11,239 --> 00:44:14,520 Speaker 1: eagerly looking forward to the day when I'd be able to 695 00:44:14,520 --> 00:44:17,160 Speaker 1: hop into a driverless car and go wherever I wanted. 696 00:44:18,120 --> 00:44:22,160 Speaker 1: These days, I would say, I'm more cautiously optimistic. 697 00:44:23,160 --> 00:44:25,360 Speaker 1: I do still think we're going to get to a 698 00:44:25,400 --> 00:44:28,919 Speaker 1: future where autonomous vehicles will play a more central role 699 00:44:29,120 --> 00:44:32,719 Speaker 1: in how we get around locally, and in fact, it 700 00:44:32,800 --> 00:44:35,560 Speaker 1: might even cut way back on the number 701 00:44:35,600 --> 00:44:38,239 Speaker 1: of vehicles we have on the road in the long run. 702 00:44:38,360 --> 00:44:40,640 Speaker 1: But I think that's going to take ages to play out.
703 00:44:41,760 --> 00:44:45,640 Speaker 1: Maybe we'll see those visions of a future where parking 704 00:44:45,640 --> 00:44:49,160 Speaker 1: lots are reclaimed and turned into stuff like parks and whatnot. 705 00:44:49,280 --> 00:44:52,560 Speaker 1: Maybe we'll see that come true, because maybe we will 706 00:44:52,600 --> 00:44:54,840 Speaker 1: see a future where we have a fleet of autonomous 707 00:44:54,880 --> 00:44:58,680 Speaker 1: vehicles that just kind of are there on demand whenever 708 00:44:58,719 --> 00:45:03,399 Speaker 1: we need them, and then go somewhere when we don't. 709 00:45:03,480 --> 00:45:05,560 Speaker 1: That somewhere is a big question mark, by the way, 710 00:45:05,680 --> 00:45:08,160 Speaker 1: one that we haven't answered. And maybe we won't own our 711 00:45:08,200 --> 00:45:12,399 Speaker 1: own cars, right; we'll just use driverless vehicles as an 712 00:45:12,440 --> 00:45:15,120 Speaker 1: on-demand service whenever we need to get somewhere. It 713 00:45:15,120 --> 00:45:17,879 Speaker 1: would be priced at a point where it would make 714 00:45:18,520 --> 00:45:20,520 Speaker 1: sense to do that, where, you know, it would be 715 00:45:20,560 --> 00:45:24,440 Speaker 1: at least no more expensive than owning and maintaining 716 00:45:24,440 --> 00:45:28,120 Speaker 1: your own vehicle, and preferably less expensive than that. For 717 00:45:28,200 --> 00:45:31,319 Speaker 1: that to happen, a lot of stuff has to fall 718 00:45:31,320 --> 00:45:33,400 Speaker 1: into place, and it goes well beyond the scope of 719 00:45:33,400 --> 00:45:34,960 Speaker 1: this episode, so I'm not going to go into it. 720 00:45:35,000 --> 00:45:38,640 Speaker 1: But let's say that all these things are actually able 721 00:45:38,680 --> 00:45:43,920 Speaker 1: to come true. If that's in fact a possibility, it's 722 00:45:43,920 --> 00:45:47,160 Speaker 1: gonna take decades, if not longer, for us to get there. 723 00:45:47,440 --> 00:45:50,920 Speaker 1: We've just got so much inertia built 724 00:45:51,000 --> 00:45:54,720 Speaker 1: up that we have to get through in order 725 00:45:54,800 --> 00:45:57,360 Speaker 1: to make that future a reality. And we have to remind 726 00:45:57,400 --> 00:46:00,680 Speaker 1: ourselves that achieving that vision is going to require 727 00:46:00,719 --> 00:46:03,279 Speaker 1: a ton of work and solutions to a lot 728 00:46:03,360 --> 00:46:06,640 Speaker 1: of non-trivial problems. That is not a reason to 729 00:46:06,680 --> 00:46:09,439 Speaker 1: give up, but it is a good reason to check 730 00:46:09,480 --> 00:46:11,960 Speaker 1: our expectations and to look at things from a more 731 00:46:12,000 --> 00:46:15,800 Speaker 1: critical point of view, and to approach things that way. 732 00:46:15,840 --> 00:46:20,560 Speaker 1: That way, when we're really being critical, when we're really 733 00:46:20,600 --> 00:46:25,440 Speaker 1: acknowledging the challenges that face us, we're more equipped to 734 00:46:25,640 --> 00:46:29,799 Speaker 1: meet those challenges, to find solutions, and to work through them.
735 00:46:30,480 --> 00:46:33,319 Speaker 1: If we fall into the trap of being overhyped, like 736 00:46:33,400 --> 00:46:36,719 Speaker 1: I was back in the day, then we're not gonna 737 00:46:36,960 --> 00:46:40,640 Speaker 1: realize those challenges until they are undeniable, and by then we've 738 00:46:40,640 --> 00:46:45,400 Speaker 1: wasted time, money, and resources on things that were trivial 739 00:46:45,680 --> 00:46:49,560 Speaker 1: in the long term. So still, I'm still 740 00:46:49,640 --> 00:46:53,520 Speaker 1: up on autonomous vehicles. I'm still eager to see 741 00:46:53,520 --> 00:46:56,399 Speaker 1: them become a thing; I'm just not entirely sure that they're going 742 00:46:56,440 --> 00:46:59,160 Speaker 1: to be an effective thing within my lifetime. I mean, 743 00:46:59,160 --> 00:47:02,600 Speaker 1: they will exist, they will be out there. I just 744 00:47:02,640 --> 00:47:05,600 Speaker 1: don't know at what sort of density that will be. 745 00:47:06,120 --> 00:47:09,440 Speaker 1: But I do think it's a worthwhile pursuit. I think 746 00:47:09,480 --> 00:47:14,200 Speaker 1: the benefits are undeniable if we are able to solve 747 00:47:14,239 --> 00:47:18,479 Speaker 1: the problems. All right, that wraps up this catch-up 748 00:47:18,520 --> 00:47:22,680 Speaker 1: on autonomous vehicles. Hope you enjoyed it. As a reminder, 749 00:47:23,280 --> 00:47:25,879 Speaker 1: if you listen to this show on the I Heart 750 00:47:25,960 --> 00:47:28,279 Speaker 1: Radio app, really, if you just download the I Heart 751 00:47:28,320 --> 00:47:31,359 Speaker 1: Radio app, you can use that talkback feature to leave 752 00:47:31,400 --> 00:47:34,160 Speaker 1: comments either on this episode or on the show in general, 753 00:47:34,840 --> 00:47:37,640 Speaker 1: and it can be up to thirty seconds, and I'll 754 00:47:37,680 --> 00:47:40,120 Speaker 1: be able to hear it and maybe even use it 755 00:47:40,120 --> 00:47:43,080 Speaker 1: in a future episode if you would like that. Also, 756 00:47:43,239 --> 00:47:46,040 Speaker 1: if you would prefer some other method, the other tried 757 00:47:46,080 --> 00:47:47,680 Speaker 1: and true way to get in touch with the show 758 00:47:47,880 --> 00:47:50,880 Speaker 1: is through Twitter. The handle for the show is tech 759 00:47:50,880 --> 00:47:54,000 Speaker 1: stuff h s W, and I'll talk to you again 760 00:47:54,800 --> 00:48:03,000 Speaker 1: really soon. Tech Stuff is an I Heart Radio production. 761 00:48:03,200 --> 00:48:06,040 Speaker 1: For more podcasts from I Heart Radio, visit the I 762 00:48:06,160 --> 00:48:09,400 Speaker 1: Heart Radio app, Apple Podcasts, or wherever you listen to 763 00:48:09,440 --> 00:48:10,360 Speaker 1: your favorite shows.