1 00:00:04,400 --> 00:00:07,800 Speaker 1: Welcome to TechStuff, a production from iHeartRadio. 2 00:00:11,800 --> 00:00:14,120 Speaker 1: Hey there, and welcome to TechStuff. I'm your host 3 00:00:14,200 --> 00:00:16,720 Speaker 1: Jonathan Strickland. I'm an executive producer with iHeartRadio. 4 00:00:16,880 --> 00:00:21,000 Speaker 1: And how the tech are you? You know, we've been 5 00:00:21,040 --> 00:00:24,279 Speaker 1: talking about driverless cars for quite a few years now, 6 00:00:24,640 --> 00:00:28,320 Speaker 1: even on this show and in previous episodes of TechStuff, 7 00:00:28,360 --> 00:00:31,560 Speaker 1: I've talked about the various levels of autonomy, which range 8 00:00:31,600 --> 00:00:35,160 Speaker 1: from zero, which means the vehicle in question really has 9 00:00:35,240 --> 00:00:38,960 Speaker 1: no driver-assist features at all, and the human behind 10 00:00:39,000 --> 00:00:41,519 Speaker 1: the wheel is responsible for everything, you know, barring some 11 00:00:41,560 --> 00:00:45,800 Speaker 1: sort of mechanical issue at any rate, and then you 12 00:00:45,840 --> 00:00:48,760 Speaker 1: go all the way up to level five autonomy, which 13 00:00:48,760 --> 00:00:50,519 Speaker 1: is a car that's so smart you could enroll it 14 00:00:50,560 --> 00:00:54,160 Speaker 1: in Harvard. No, but seriously, a level five autonomous vehicle 15 00:00:54,640 --> 00:00:57,880 Speaker 1: would have no need for human-accessible controls. It would 16 00:00:57,880 --> 00:01:04,360 Speaker 1: just transport people and/or cargo wherever, under any driving conditions, 17 00:01:04,400 --> 00:01:08,800 Speaker 1: at any time of day or night.
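To make the scale concrete, here's a minimal Python sketch of the autonomy levels described above. The names and one-line summaries are a rough paraphrase of the commonly cited SAE J3016 levels, not an official definition, and the helper function is just an illustration of the level-four cutoff the episode mentions.

```python
from enum import IntEnum

class AutonomyLevel(IntEnum):
    """Rough paraphrase of the SAE driving-automation levels discussed above."""
    NO_AUTOMATION = 0      # human does everything, no driver-assist features
    DRIVER_ASSISTANCE = 1  # a single assist feature, e.g. cruise control
    PARTIAL = 2            # steering plus speed assist; human still monitors
    CONDITIONAL = 3        # system drives in some conditions, human on standby
    HIGH = 4               # system drives itself within a limited domain
    FULL = 5               # no human-accessible controls needed, anywhere, anytime

def needs_human_fallback(level: AutonomyLevel) -> bool:
    # Below level four, a human must be ready to take over the wheel.
    return level < AutonomyLevel.HIGH

print(needs_human_fallback(AutonomyLevel.PARTIAL))  # True
print(needs_human_fallback(AutonomyLevel.FULL))     # False
```

The episode's point that we're "hovering around levels two and three" maps onto the middle of this enum: systems that assist but still assume a human fallback.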
We're currently hovering 18 00:01:09,160 --> 00:01:12,720 Speaker 1: around levels two and three at the moment, mostly. There 19 00:01:12,760 --> 00:01:15,640 Speaker 1: are pushes to get to level four, but as it 20 00:01:15,680 --> 00:01:19,640 Speaker 1: has turned out, the technology necessary to create truly autonomous 21 00:01:19,720 --> 00:01:24,520 Speaker 1: vehicles that can operate anywhere at any time is harder 22 00:01:24,560 --> 00:01:28,960 Speaker 1: than some of us, myself included, initially thought. And most 23 00:01:29,000 --> 00:01:32,160 Speaker 1: of our discussions about driverless cars treat the vehicles as, 24 00:01:32,640 --> 00:01:35,760 Speaker 1: you know, self-contained computing systems that just happen to 25 00:01:35,760 --> 00:01:38,800 Speaker 1: be on the go. So by that, I mean the 26 00:01:38,880 --> 00:01:42,399 Speaker 1: car mostly is relying on its own sensors to understand 27 00:01:42,600 --> 00:01:45,760 Speaker 1: what is going on around it. The car's decision-making 28 00:01:45,800 --> 00:01:49,720 Speaker 1: systems are fully contained within the vehicle itself. And you 29 00:01:49,760 --> 00:01:52,560 Speaker 1: can think of a highway filled with those types of 30 00:01:52,600 --> 00:01:55,600 Speaker 1: autonomous vehicles being kind of similar to a room filled 31 00:01:55,600 --> 00:01:58,320 Speaker 1: with computers that are not connected to each other, so 32 00:01:58,360 --> 00:02:01,360 Speaker 1: they are not networked. They're all individual computers. Each 33 00:02:01,400 --> 00:02:05,000 Speaker 1: machine is sophisticated and capable, but it's also independent 34 00:02:05,080 --> 00:02:09,639 Speaker 1: of all the others. Other visions of a driverless future 35 00:02:09,960 --> 00:02:14,359 Speaker 1: involve cars that can communicate with one another directly or 36 00:02:14,400 --> 00:02:19,240 Speaker 1: even potentially with roads and infrastructure.
These cars would depend 37 00:02:19,360 --> 00:02:22,720 Speaker 1: not just on their own sensors, but on those of the 38 00:02:22,800 --> 00:02:26,680 Speaker 1: vehicles around them. They would remain in constant communication to 39 00:02:26,720 --> 00:02:30,960 Speaker 1: provide the smoothest, safest, and most efficient ride from point 40 00:02:31,040 --> 00:02:32,840 Speaker 1: A to point B, so they would be kind of 41 00:02:33,400 --> 00:02:36,920 Speaker 1: part of a mesh network. And to get to that future, 42 00:02:37,200 --> 00:02:40,440 Speaker 1: we would need to agree upon a standardized protocol for 43 00:02:40,480 --> 00:02:43,760 Speaker 1: these cars to follow. Otherwise you could have a world 44 00:02:43,840 --> 00:02:47,600 Speaker 1: where Toyotas wouldn't talk to Fords, they would only talk 45 00:02:47,600 --> 00:02:50,239 Speaker 1: to other Toyotas, and that kind of thing. That would 46 00:02:50,280 --> 00:02:53,720 Speaker 1: be a nightmare, that would not be helpful. Anyway, that's 47 00:02:53,840 --> 00:02:56,480 Speaker 1: not the direction that most companies are going in right now, 48 00:02:57,000 --> 00:03:00,120 Speaker 1: though I suspect for some companies that are more focused on 49 00:03:00,400 --> 00:03:04,320 Speaker 1: creating autonomous transportation vehicles for stuff like cargo, or at 50 00:03:04,400 --> 00:03:09,040 Speaker 1: least are looking at this from a fleetwide perspective, it could 51 00:03:09,080 --> 00:03:13,880 Speaker 1: serve a similar purpose, but specifically for this particular, you know, 52 00:03:13,960 --> 00:03:17,760 Speaker 1: business model. But today I wanted to talk about a 53 00:03:17,760 --> 00:03:20,760 Speaker 1: slightly different approach. And I've talked about this a 54 00:03:20,800 --> 00:03:23,200 Speaker 1: little bit in the past when I've talked about autonomous cars.
55 00:03:23,200 --> 00:03:27,400 Speaker 1: So this one has had several different incarnations over the years, 56 00:03:27,440 --> 00:03:32,079 Speaker 1: and it's a cool idea on the surface, but 57 00:03:33,040 --> 00:03:38,160 Speaker 1: it's probably the least practical for driverless vehicles or 58 00:03:38,200 --> 00:03:41,440 Speaker 1: autonomous vehicles, insofar as the amount of work we would 59 00:03:41,440 --> 00:03:44,960 Speaker 1: need to do on an infrastructure level and the amount 60 00:03:45,040 --> 00:03:48,120 Speaker 1: of consensus we would have to have across all parties 61 00:03:48,120 --> 00:03:51,040 Speaker 1: in the space. But this is the concept of the 62 00:03:51,120 --> 00:03:56,320 Speaker 1: automated highway system. With this kind of system, you would 63 00:03:56,320 --> 00:04:02,480 Speaker 1: have embedded networks that are along or under or above highways, 64 00:04:02,520 --> 00:04:06,080 Speaker 1: and these systems would provide guidance to the vehicles on 65 00:04:06,120 --> 00:04:09,760 Speaker 1: the road, perhaps even to the extent of controlling all 66 00:04:09,800 --> 00:04:13,800 Speaker 1: the vehicles on the road. So in that kind of 67 00:04:13,800 --> 00:04:16,320 Speaker 1: a system, the road would be the one doing the driving. 68 00:04:16,360 --> 00:04:19,200 Speaker 1: In other words, it would send commands to all the 69 00:04:19,279 --> 00:04:23,080 Speaker 1: vehicles traveling on the lanes, which would follow those commands. 70 00:04:23,400 --> 00:04:26,680 Speaker 1: And it would be kind of like an automated air 71 00:04:26,720 --> 00:04:30,680 Speaker 1: traffic control system, with the automated highway guiding vehicles on 72 00:04:30,800 --> 00:04:33,360 Speaker 1: and off the highway seamlessly in a way to avoid 73 00:04:33,440 --> 00:04:38,240 Speaker 1: traffic congestion and to avoid accidents. And it's a neat idea.
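As a thought experiment, the "road does the driving" architecture just described can be sketched in a few lines of Python. Everything here is invented for illustration (there is no real automated highway protocol behind these class or method names), but it shows the division of labor: a roadside segment controller decides on one commanded speed and spacing, and the vehicles simply obey.

```python
# Illustrative-only sketch of an automated highway segment. The controller
# computes one speed and headway for its lane and pushes commands to every
# vehicle; the vehicles have no decision-making of their own.

class Vehicle:
    def __init__(self, vid: str):
        self.vid = vid
        self.speed_mph = 0.0
        self.headway_s = 0.0

    def apply_command(self, speed_mph: float, headway_s: float) -> None:
        # The car just follows whatever the road tells it to do.
        self.speed_mph = speed_mph
        self.headway_s = headway_s

class HighwaySegmentController:
    def __init__(self, target_speed_mph: float, min_headway_s: float):
        self.target_speed_mph = target_speed_mph
        self.min_headway_s = min_headway_s
        self.vehicles: list[Vehicle] = []

    def admit(self, v: Vehicle) -> None:
        # Like air traffic control, vehicles enter the segment's jurisdiction.
        self.vehicles.append(v)

    def broadcast(self) -> None:
        # Denser traffic gets a slightly lower commanded speed, a crude
        # stand-in for congestion management.
        speed = self.target_speed_mph / (1 + 0.01 * len(self.vehicles))
        for v in self.vehicles:
            v.apply_command(speed, self.min_headway_s)

ctrl = HighwaySegmentController(target_speed_mph=50.0, min_headway_s=2.0)
cars = [Vehicle(f"car{i}") for i in range(3)]
for c in cars:
    ctrl.admit(c)
ctrl.broadcast()
# Every car on the segment now shares one commanded speed and spacing.
```

The point of the sketch is the inversion: the intelligence lives in the infrastructure, not the cars, which is exactly what makes the standardization and deployment problems discussed later so hard.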
74 00:04:38,640 --> 00:04:43,000 Speaker 1: It's just, I won't say impossible, but about 75 00:04:43,040 --> 00:04:45,320 Speaker 1: as close to impossible as you can get without it 76 00:04:45,360 --> 00:04:48,279 Speaker 1: being impossible to do. But that doesn't mean that folks 77 00:04:48,320 --> 00:04:52,920 Speaker 1: haven't tried. The idea itself is actually coming up on 78 00:04:53,040 --> 00:04:56,440 Speaker 1: being a century old. I mean, okay, we still have 79 00:04:56,520 --> 00:04:59,560 Speaker 1: seventeen years to go, and yes, seventeen years for 80 00:04:59,640 --> 00:05:01,400 Speaker 1: many of you will seem like a really long time. 81 00:05:01,440 --> 00:05:04,400 Speaker 1: But the older I get, the less long 82 00:05:04,520 --> 00:05:07,920 Speaker 1: seventeen years sounds to me, but we are getting close 83 00:05:07,960 --> 00:05:12,839 Speaker 1: to a century of following along with this idea. So back 84 00:05:12,880 --> 00:05:17,799 Speaker 1: in nineteen thirty-nine, GM sponsored a pavilion at the World's Fair 85 00:05:17,960 --> 00:05:20,240 Speaker 1: in New York. I've talked about this in past Tech 86 00:05:20,240 --> 00:05:23,440 Speaker 1: Stuff episodes too, but it's one of my favorite historical 87 00:05:23,560 --> 00:05:28,240 Speaker 1: examples of futuristic thinking. So GM had a pavilion that 88 00:05:28,320 --> 00:05:33,000 Speaker 1: was called the Highways and Horizons Pavilion, and the centerpiece 89 00:05:33,560 --> 00:05:38,680 Speaker 1: of this pavilion was an exhibit called Futurama. Bum bum 90 00:05:38,720 --> 00:05:42,840 Speaker 1: bum bum bum. That's all I can hum of that 91 00:05:42,880 --> 00:05:47,360 Speaker 1: before I get a DMCA strike.
Anyway, Futurama, 92 00:05:47,360 --> 00:05:50,320 Speaker 1: in this case, was this massive model of what a 93 00:05:50,400 --> 00:05:54,520 Speaker 1: landscape of the future could look like, and it included, 94 00:05:54,960 --> 00:05:58,960 Speaker 1: you know, cityscapes and rural areas, included farms and 95 00:05:59,040 --> 00:06:04,239 Speaker 1: hydroelectric facilities. Like, it was expansive. It was huge 96 00:06:05,000 --> 00:06:10,279 Speaker 1: and incredible and included some moving components, too. A theatrical 97 00:06:10,320 --> 00:06:14,960 Speaker 1: designer named Norman Bel Geddes masterminded this exhibit, and 98 00:06:15,040 --> 00:06:19,360 Speaker 1: Geddes, he was a real forward thinker. You can 99 00:06:19,440 --> 00:06:23,719 Speaker 1: actually watch a full film that details the Futurama model 100 00:06:23,880 --> 00:06:28,360 Speaker 1: on YouTube. It's about five minutes long. And you can 101 00:06:28,400 --> 00:06:32,560 Speaker 1: even tell that this film inspired lots of later attractions, 102 00:06:32,600 --> 00:06:36,200 Speaker 1: particularly the ones that opened with the debut of EPCOT 103 00:06:36,320 --> 00:06:39,280 Speaker 1: way back when at Walt Disney World. If you've ever 104 00:06:39,320 --> 00:06:41,680 Speaker 1: been on any of those old EPCOT rides, or you 105 00:06:41,760 --> 00:06:44,880 Speaker 1: watched videos of those old EPCOT rides, they owe 106 00:06:44,920 --> 00:06:49,400 Speaker 1: a lot to this particular film of the Futurama exhibit 107 00:06:49,440 --> 00:06:54,360 Speaker 1: from way back in nineteen thirty-nine. Anyway, the film covers a lot 108 00:06:54,360 --> 00:06:57,599 Speaker 1: of different bits, but the part we're interested in is 109 00:06:57,640 --> 00:07:01,480 Speaker 1: this idea of automated highways.
Now, keep in mind, 110 00:07:01,480 --> 00:07:04,479 Speaker 1: this blue-sky automated highway concept came at a time 111 00:07:04,480 --> 00:07:09,560 Speaker 1: when highways themselves were really new. In the United States, 112 00:07:09,600 --> 00:07:12,560 Speaker 1: the US was in the process of laying out highways 113 00:07:12,640 --> 00:07:16,160 Speaker 1: in nineteen thirty-nine to connect major cities together, even 114 00:07:16,160 --> 00:07:21,480 Speaker 1: as Geddes's exhibit suggested a revolutionary approach. Also, 115 00:07:21,560 --> 00:07:24,240 Speaker 1: I want to add a note here. While the view 116 00:07:24,280 --> 00:07:27,120 Speaker 1: of the future was very forward thinking, if you watch 117 00:07:27,160 --> 00:07:31,160 Speaker 1: this film, you're like, wow, this was, you know, really imaginative, 118 00:07:31,920 --> 00:07:36,320 Speaker 1: the social concepts definitely were not forward thinking. The film, 119 00:07:36,840 --> 00:07:40,080 Speaker 1: um, the one about the Futurama exhibit, pretty much outright 120 00:07:40,120 --> 00:07:42,720 Speaker 1: says that men are behind all the progress. It just 121 00:07:42,800 --> 00:07:46,440 Speaker 1: uses men as the default, and clearly that ignores the 122 00:07:46,440 --> 00:07:51,520 Speaker 1: fact that countless others besides men made significant contributions up 123 00:07:51,560 --> 00:07:55,200 Speaker 1: to nineteen thirty-nine and since, and, uh, yeah, I 124 00:07:55,240 --> 00:07:57,920 Speaker 1: just want to call that out because you know we should. 125 00:07:58,400 --> 00:08:01,600 Speaker 1: So it's certainly not a perfect film by any 126 00:08:01,600 --> 00:08:04,640 Speaker 1: stretch of the imagination. However, the exhibit gives us a 127 00:08:04,760 --> 00:08:09,280 Speaker 1: view of the far-off future of nineteen sixty.
Keep 128 00:08:09,320 --> 00:08:12,880 Speaker 1: in mind that was twenty years away when this exhibit 129 00:08:12,920 --> 00:08:17,120 Speaker 1: was being shown, and the motorways in this exhibit allow 130 00:08:17,240 --> 00:08:21,280 Speaker 1: cars to travel at designated speeds. The accompanying film says 131 00:08:21,360 --> 00:08:26,680 Speaker 1: that those speeds are fifty miles per hour and they 132 00:08:26,720 --> 00:08:30,360 Speaker 1: will maintain proper distance from other cars through quote unquote 133 00:08:30,480 --> 00:08:36,199 Speaker 1: radio control. So presumably the motorway itself is the transmitter 134 00:08:36,240 --> 00:08:40,000 Speaker 1: and the cars are receivers. Alternatively, the cars could be 135 00:08:40,040 --> 00:08:43,080 Speaker 1: both transmitters and receivers and be communicating with each other. 136 00:08:43,840 --> 00:08:46,839 Speaker 1: It's not entirely clear. Now, you could argue that the 137 00:08:46,920 --> 00:08:52,240 Speaker 1: nineteen thirty-nine exhibit was marketing, and it was just overly optimistic 138 00:08:52,240 --> 00:08:56,200 Speaker 1: projections about the future for the purposes of marketing for GM. 139 00:08:56,240 --> 00:08:58,360 Speaker 1: And I think you would have a pretty solid argument 140 00:08:58,400 --> 00:09:01,960 Speaker 1: if you said that. I'm not sure anyone involved really 141 00:09:02,120 --> 00:09:05,480 Speaker 1: believed that this particular vision of the future was what 142 00:09:05,520 --> 00:09:08,400 Speaker 1: was going to happen by nineteen sixty. But then it's 143 00:09:08,440 --> 00:09:10,560 Speaker 1: always hard to predict the future, even if it's just 144 00:09:10,600 --> 00:09:13,880 Speaker 1: a short ways out. Also, the US of nineteen thirty-nine 145 00:09:13,960 --> 00:09:17,439 Speaker 1: was entering into very uncertain territory.
It was recovering from 146 00:09:17,480 --> 00:09:20,760 Speaker 1: the Great Depression, which was finally coming to an end 147 00:09:20,800 --> 00:09:24,600 Speaker 1: after a decade of economic turmoil, and the US had 148 00:09:24,640 --> 00:09:28,200 Speaker 1: not yet entered into World War Two. And obviously the 149 00:09:28,240 --> 00:09:32,080 Speaker 1: experience of World War Two would affect the American point 150 00:09:32,080 --> 00:09:36,679 Speaker 1: of view significantly. It would really boost xenophobia, for one thing. 151 00:09:37,160 --> 00:09:40,240 Speaker 1: But let's skip on ahead. While Geddes's vision 152 00:09:40,280 --> 00:09:43,520 Speaker 1: of nineteen sixty did not manifest in reality, we did 153 00:09:43,559 --> 00:09:47,400 Speaker 1: not get these automated highways by nineteen sixty, there were 154 00:09:47,440 --> 00:09:51,360 Speaker 1: folks who were working on the concept of automated highways, 155 00:09:51,400 --> 00:09:55,239 Speaker 1: at least researching ways where perhaps it could be possible, 156 00:09:55,880 --> 00:09:59,920 Speaker 1: and that was mostly focusing on mechanical systems and radio 157 00:10:00,000 --> 00:10:05,319 Speaker 1: control systems. And over time, as computers began to evolve 158 00:10:05,440 --> 00:10:09,560 Speaker 1: and become more commonplace, there was a transition toward looking 159 00:10:09,559 --> 00:10:13,160 Speaker 1: at computer-controlled systems, the hope being that maybe a computer 160 00:10:13,240 --> 00:10:17,440 Speaker 1: would be sophisticated enough to handle the complex operations that 161 00:10:17,440 --> 00:10:21,840 Speaker 1: would be required to have a truly automated highway system. 162 00:10:21,880 --> 00:10:25,880 Speaker 1: But computers themselves were really large and limited early on.
163 00:10:26,280 --> 00:10:30,680 Speaker 1: As they evolved, there was also another big shift, and 164 00:10:30,720 --> 00:10:33,440 Speaker 1: I'll talk about that after we come back from this 165 00:10:33,520 --> 00:10:45,320 Speaker 1: quick break. Okay, before the break, I was talking about 166 00:10:45,320 --> 00:10:49,520 Speaker 1: how, as computer systems were starting to get more sophisticated 167 00:10:49,559 --> 00:10:53,880 Speaker 1: and smaller, miniaturization being a big part of this, we 168 00:10:53,920 --> 00:10:56,520 Speaker 1: started to see a shift, and that shift would end 169 00:10:56,600 --> 00:11:01,760 Speaker 1: up being a really significant one as far as the 170 00:11:01,800 --> 00:11:07,320 Speaker 1: concept and plausibility of automated highway systems are concerned. And 171 00:11:07,360 --> 00:11:09,840 Speaker 1: that was that car companies were starting to design and 172 00:11:09,960 --> 00:11:14,960 Speaker 1: integrate driver safety features in vehicles, and these features bordered 173 00:11:15,040 --> 00:11:19,920 Speaker 1: on or sometimes overlapped the autonomous car concept. So, for example, 174 00:11:19,960 --> 00:11:23,559 Speaker 1: cruise control. The actual idea of cruise control is almost 175 00:11:23,559 --> 00:11:26,800 Speaker 1: as old as the car itself. It's essentially a way 176 00:11:26,880 --> 00:11:31,400 Speaker 1: of having a car maintain a set speed. In some cases, 177 00:11:31,480 --> 00:11:33,960 Speaker 1: it was a set speed that the driver couldn't choose, 178 00:11:34,120 --> 00:11:38,120 Speaker 1: like it was either full throttle or not. But then 179 00:11:38,240 --> 00:11:41,920 Speaker 1: later you had more sophisticated systems that let you set 180 00:11:41,960 --> 00:11:45,319 Speaker 1: cruise control at a speed of your choosing, and your 181 00:11:45,400 --> 00:11:47,480 Speaker 1: vehicle would just maintain that speed.
Now, that was just 182 00:11:47,559 --> 00:11:51,839 Speaker 1: one tiny step toward creating systems that would augment a 183 00:11:51,920 --> 00:11:55,520 Speaker 1: driver's ability to navigate the roads safely and, moreover, 184 00:11:56,400 --> 00:12:00,760 Speaker 1: serve as a kind of building block toward autonomous cars. Obviously, a 185 00:12:00,840 --> 00:12:04,040 Speaker 1: lot more systems would be needed in order to even 186 00:12:04,120 --> 00:12:07,400 Speaker 1: remotely consider a car to be autonomous, but these were 187 00:12:07,400 --> 00:12:09,680 Speaker 1: the steps that were leading in that direction, and we 188 00:12:09,720 --> 00:12:13,160 Speaker 1: would see that trend continue over time. Cars would get 189 00:12:13,840 --> 00:12:17,160 Speaker 1: more systems that would make them safer and be able 190 00:12:17,200 --> 00:12:20,120 Speaker 1: to take over certain operations. You know, cars would get 191 00:12:20,120 --> 00:12:23,360 Speaker 1: integrated cameras, which would help when you're backing out of 192 00:12:23,360 --> 00:12:26,920 Speaker 1: a space, for example. They'd get collision sensors, they'd get 193 00:12:27,000 --> 00:12:31,360 Speaker 1: lane detection sensors, and much more. So the cars were 194 00:12:31,480 --> 00:12:36,040 Speaker 1: evolving into smart machines that could operate on essentially a 195 00:12:36,160 --> 00:12:38,800 Speaker 1: dumb network. Right? You can think of the roads and 196 00:12:38,880 --> 00:12:44,199 Speaker 1: highway systems as a network, and the network itself is 197 00:12:44,360 --> 00:12:47,319 Speaker 1: fairly robust. I mean, if you drive in Atlanta, you 198 00:12:47,360 --> 00:12:49,480 Speaker 1: might question that with all the potholes, but you get 199 00:12:49,480 --> 00:12:52,600 Speaker 1: what I'm saying. It's a robust system, but it's dumb. 200 00:12:52,840 --> 00:12:57,280 Speaker 1: It does not have any real intelligent components itself.
And 201 00:12:57,320 --> 00:13:01,680 Speaker 1: the cars became the smart components, smart devices that 202 00:13:01,679 --> 00:13:07,480 Speaker 1: could navigate dumb systems. Meanwhile, in research organizations such as 203 00:13:07,520 --> 00:13:12,040 Speaker 1: the Partners for Advanced Transit and Highways, or PATH, which 204 00:13:12,080 --> 00:13:15,320 Speaker 1: originated out of the University of California, you had researchers 205 00:13:15,360 --> 00:13:19,600 Speaker 1: who were looking into various approaches for autonomous driving, which 206 00:13:19,640 --> 00:13:24,120 Speaker 1: included automated highway technologies. So not just, you know, how 207 00:13:24,160 --> 00:13:27,280 Speaker 1: can we make an autonomous car, but would it be 208 00:13:27,320 --> 00:13:31,560 Speaker 1: possible to make a highway system that itself was intelligent 209 00:13:32,120 --> 00:13:37,079 Speaker 1: and remove the requirement of having intelligent vehicles instead? Now, 210 00:13:37,120 --> 00:13:40,880 Speaker 1: this was in the nineteen eighties, and it started to 211 00:13:41,000 --> 00:13:43,480 Speaker 1: pick up some momentum because, you know, in the nineteen 212 00:13:43,480 --> 00:13:47,720 Speaker 1: eighties we certainly were miles, no pun intended, miles away 213 00:13:47,800 --> 00:13:51,840 Speaker 1: from the technologies that would allow a truly autonomous vehicle. 214 00:13:52,600 --> 00:13:56,760 Speaker 1: So by nineteen ninety-one, things got shifted into a higher gear. 215 00:13:56,800 --> 00:13:59,080 Speaker 1: To use another pun. I'm just gonna keep doing those, 216 00:13:59,080 --> 00:14:01,760 Speaker 1: I guess. I don't mean to, I honestly don't.
But the 217 00:14:01,840 --> 00:14:07,280 Speaker 1: US government passed the Intermodal Surface Transportation Efficiency Act, or 218 00:14:07,559 --> 00:14:12,120 Speaker 1: ISTEA, and that's often a question we ask here 219 00:14:12,240 --> 00:14:16,120 Speaker 1: in Atlanta, Georgia, when someone hands us a drink: we 220 00:14:16,160 --> 00:14:20,840 Speaker 1: point to it and say, "Ice tea?" Anyway, this Act laid 221 00:14:20,880 --> 00:14:24,800 Speaker 1: the groundwork for early testing of automated vehicles and roadways. 222 00:14:24,960 --> 00:14:28,040 Speaker 1: So the U.S. Department of Transportation would then create 223 00:14:28,520 --> 00:14:33,000 Speaker 1: the National Automated Highway System Research Program, or 224 00:14:33,400 --> 00:14:38,160 Speaker 1: NAHSRP. This program's main purpose was to establish 225 00:14:38,280 --> 00:14:43,480 Speaker 1: specifications for an automated highway system. Again, because with so 226 00:14:43,520 --> 00:14:46,880 Speaker 1: many different players in the space, like all the different 227 00:14:47,360 --> 00:14:53,280 Speaker 1: government agencies across, you know, federal and state and regional systems, 228 00:14:53,760 --> 00:14:56,680 Speaker 1: you have all the different car companies that are involved. 229 00:14:57,040 --> 00:15:00,440 Speaker 1: Like, there are a lot of different players who are 230 00:15:01,240 --> 00:15:06,120 Speaker 1: deeply concerned with things like highway technologies. You have 231 00:15:06,240 --> 00:15:11,200 Speaker 1: to establish standards so that there is interoperability between the 232 00:15:11,240 --> 00:15:14,720 Speaker 1: system and all the vehicles. Right?
Again, if you have 233 00:15:14,760 --> 00:15:17,920 Speaker 1: a system that only communicates with one type of vehicle, 234 00:15:18,200 --> 00:15:21,600 Speaker 1: then it's not really a good system, right? For anything 235 00:15:21,640 --> 00:15:24,720 Speaker 1: other than that vehicle, it's a terrible system, or it 236 00:15:24,800 --> 00:15:29,720 Speaker 1: might as well not exist. In nineteen ninety-four, the Department of Transportation 237 00:15:29,800 --> 00:15:35,120 Speaker 1: created the National Automated Highway System Consortium. So this was 238 00:15:35,200 --> 00:15:37,920 Speaker 1: kind of a think tank that was made up of 239 00:15:38,880 --> 00:15:44,640 Speaker 1: somewhere around a hundred different members across nine major categories. 240 00:15:44,920 --> 00:15:49,800 Speaker 1: Those categories included things like research organizations, government agencies, again 241 00:15:50,040 --> 00:15:53,680 Speaker 1: at different levels like federal, state, and local. It included 242 00:15:53,840 --> 00:15:58,280 Speaker 1: academic organizations like the University of California, and of course 243 00:15:58,640 --> 00:16:03,520 Speaker 1: representatives from the transportation industry. So the hope was that 244 00:16:03,560 --> 00:16:07,440 Speaker 1: by including all of these groups in the conversation, 245 00:16:07,520 --> 00:16:10,560 Speaker 1: everyone would be able to arrive at a consensus that 246 00:16:10,600 --> 00:16:15,320 Speaker 1: would make for the most viable, efficient, safe, and practical solution 247 00:16:15,960 --> 00:16:23,000 Speaker 1: for automated highways. Also, hopefully, the most affordable, because you 248 00:16:23,000 --> 00:16:26,200 Speaker 1: would ultimately be turning to the US taxpayer to foot 249 00:16:26,240 --> 00:16:29,640 Speaker 1: the bill for these kinds of things.
And I think 250 00:16:29,720 --> 00:16:33,640 Speaker 1: that was a wise move in some ways, because it's 251 00:16:33,640 --> 00:16:37,440 Speaker 1: always good to get stakeholders involved. One, just to make 252 00:16:37,480 --> 00:16:40,440 Speaker 1: sure that the solutions you are designing actually solve real 253 00:16:40,440 --> 00:16:43,080 Speaker 1: world problems. There are a lot of times where we 254 00:16:43,120 --> 00:16:46,800 Speaker 1: talk about an approach or a technology and we refer 255 00:16:46,880 --> 00:16:50,280 Speaker 1: to it as a solution that's looking for a problem. 256 00:16:50,320 --> 00:16:52,640 Speaker 1: That's not ideal. Right? If you have 257 00:16:52,720 --> 00:16:55,440 Speaker 1: identified what the problem is, then you can work to 258 00:16:55,640 --> 00:16:58,160 Speaker 1: solve it, but you have to make sure that your 259 00:16:58,160 --> 00:17:01,040 Speaker 1: solution actually does address the problem. That's one reason 260 00:17:01,080 --> 00:17:03,200 Speaker 1: why you want to have lots of different stakeholders there. 261 00:17:03,560 --> 00:17:05,960 Speaker 1: You also want to have buy-in across the entire 262 00:17:06,040 --> 00:17:11,240 Speaker 1: industry, or else, again, interoperability becomes a problem. However, when 263 00:17:11,280 --> 00:17:14,679 Speaker 1: you include all these different voices, there's always the danger 264 00:17:14,960 --> 00:17:18,760 Speaker 1: that projects descend into a design-by-committee approach, where 265 00:17:18,760 --> 00:17:21,159 Speaker 1: it feels like all the really progressive concepts have been 266 00:17:21,200 --> 00:17:24,120 Speaker 1: sanded down until they aren't really interesting anymore, or, worse, 267 00:17:24,960 --> 00:17:28,760 Speaker 1: you fail to find any common ground and nothing gets done.
268 00:17:29,920 --> 00:17:37,600 Speaker 1: So over the next few years, researchers poured some serious effort into conceptualizing 269 00:17:37,680 --> 00:17:42,119 Speaker 1: an automated highway system. So the benefits of such a system, 270 00:17:42,119 --> 00:17:45,359 Speaker 1: if it worked, were obvious on the surface. Like, everyone 271 00:17:45,400 --> 00:17:49,159 Speaker 1: could agree this is a good idea insofar as what 272 00:17:49,359 --> 00:17:53,400 Speaker 1: the goal is, because you would reduce traffic congestion, which 273 00:17:53,440 --> 00:17:57,160 Speaker 1: would improve travel times. It would also reduce fuel consumption; 274 00:17:57,280 --> 00:17:59,680 Speaker 1: that was an added benefit. And it would improve the 275 00:17:59,760 --> 00:18:02,439 Speaker 1: quality of life for motorists across the nation. They'd be 276 00:18:02,480 --> 00:18:07,280 Speaker 1: spending less time in transit and more time doing whatever 277 00:18:07,320 --> 00:18:10,280 Speaker 1: it was they wanted to do. So, like, those are 278 00:18:10,440 --> 00:18:13,639 Speaker 1: pretty obvious benefits of such a system. The reduction of 279 00:18:13,720 --> 00:18:19,560 Speaker 1: accidents would also be incredibly important. It could save tens 280 00:18:19,600 --> 00:18:24,440 Speaker 1: of thousands of lives every year. It is impossible to 281 00:18:24,640 --> 00:18:30,840 Speaker 1: overstate how important that is. The impact of deaths due 282 00:18:30,840 --> 00:18:36,240 Speaker 1: to traffic accidents is gargantuan. There's the personal impact on 283 00:18:36,280 --> 00:18:39,400 Speaker 1: all the friends and family of the people who are lost, 284 00:18:40,080 --> 00:18:44,760 Speaker 1: there's the financial impact, there's the societal impact. You know, 285 00:18:45,040 --> 00:18:48,760 Speaker 1: we no longer have those people contributing to society.
So yeah, 286 00:18:49,000 --> 00:18:52,119 Speaker 1: that's one of those things where not only is it 287 00:18:52,160 --> 00:18:56,199 Speaker 1: great just the idea of saving these lives, we all 288 00:18:56,280 --> 00:19:00,080 Speaker 1: stand to benefit from that. Right? We all stand to 289 00:19:00,160 --> 00:19:02,639 Speaker 1: benefit from the fact that these people are still around 290 00:19:02,640 --> 00:19:06,080 Speaker 1: and contributing to society. Not to mention, you save a 291 00:19:06,160 --> 00:19:09,360 Speaker 1: lot of money and property in the process of reducing accidents. 292 00:19:09,880 --> 00:19:12,959 Speaker 1: So the goal was clear. How to get to the 293 00:19:13,000 --> 00:19:17,240 Speaker 1: goal was the matter of debate. That debate largely ended 294 00:19:17,680 --> 00:19:23,760 Speaker 1: in nineteen ninety-seven. Why? Well, I mean, ultimately the easy answer is 295 00:19:23,800 --> 00:19:26,560 Speaker 1: the money fell out. So in part the issue was 296 00:19:26,600 --> 00:19:29,640 Speaker 1: that all those stakeholders were in fact having trouble agreeing 297 00:19:29,720 --> 00:19:32,439 Speaker 1: upon the right approach and the components that would have 298 00:19:32,520 --> 00:19:35,520 Speaker 1: to go into an automated highway, as well as into 299 00:19:36,080 --> 00:19:40,439 Speaker 1: the vehicles that would be compatible with such highways. You know, 300 00:19:40,480 --> 00:19:42,439 Speaker 1: the whole goal was to build consensus, but over the 301 00:19:42,520 --> 00:19:45,400 Speaker 1: years that just became impossible to do. You had too 302 00:19:45,400 --> 00:19:49,560 Speaker 1: many different voices with different agendas, and there was no 303 00:19:49,720 --> 00:19:54,520 Speaker 1: common ground to really solidify a path forward.
So nothing 304 00:19:54,600 --> 00:19:58,440 Speaker 1: was happening, and since the consortium was making very little headway, 305 00:19:58,560 --> 00:20:01,560 Speaker 1: it was pretty hard to justify continuing to fund it. 306 00:20:02,119 --> 00:20:05,320 Speaker 1: So in nineteen ninety-seven, the US government passed another act. 307 00:20:05,400 --> 00:20:08,440 Speaker 1: This one was the TEA-21 Act, T-E 308 00:20:08,680 --> 00:20:12,800 Speaker 1: A twenty-one, and that pulled financial support from the automated 309 00:20:12,840 --> 00:20:16,200 Speaker 1: highway research programs to put it in other areas. The 310 00:20:16,400 --> 00:20:18,760 Speaker 1: DOT essentially made the decision to focus more 311 00:20:18,800 --> 00:20:23,640 Speaker 1: on safety-oriented technologies like the aforementioned lane detection technologies, 312 00:20:24,000 --> 00:20:26,560 Speaker 1: and essentially what the government was leaning toward was where 313 00:20:26,560 --> 00:20:29,920 Speaker 1: the automotive industry was already headed, which would mean you 314 00:20:29,960 --> 00:20:35,280 Speaker 1: would have these independent autonomous vehicles, or increasingly autonomous vehicles, 315 00:20:35,680 --> 00:20:39,679 Speaker 1: that would navigate within an environment, rather than an intelligent 316 00:20:39,960 --> 00:20:44,600 Speaker 1: environment that would guide vehicles to their respective destinations. Now, 317 00:20:45,040 --> 00:20:48,000 Speaker 1: some say the technologies we're seeing evolve in vehicles in general 318 00:20:48,040 --> 00:20:51,960 Speaker 1: and autonomous vehicles in particular could be a stepping stone 319 00:20:52,040 --> 00:20:55,960 Speaker 1: toward automated highways in the future. I'll talk more about 320 00:20:56,040 --> 00:21:09,240 Speaker 1: that after we take this quick break. So could autonomous 321 00:21:09,359 --> 00:21:15,320 Speaker 1: vehicles be a step toward intelligent highways?
You know, maybe 322 00:21:15,400 --> 00:21:19,880 Speaker 1: we will see autonomous vehicle technologies converge with intelligent highways 323 00:21:20,480 --> 00:21:23,679 Speaker 1: and that the true future of motoring will be a 324 00:21:23,720 --> 00:21:29,240 Speaker 1: combination of both intelligent infrastructure and intelligent vehicles working together. 325 00:21:30,040 --> 00:21:34,080 Speaker 1: You could imagine, you know, highways that are able to 326 00:21:35,160 --> 00:21:40,879 Speaker 1: meter traffic onto them with autonomous cars so that there 327 00:21:40,920 --> 00:21:44,840 Speaker 1: are no bottlenecks on entrances and exits to the highway 328 00:21:44,840 --> 00:21:48,560 Speaker 1: for example. Right, that's a possibility. However, for all of 329 00:21:48,560 --> 00:21:51,119 Speaker 1: that to work out, we again have to create and 330 00:21:51,160 --> 00:21:55,760 Speaker 1: agree upon standards that work across all parties, all the 331 00:21:56,600 --> 00:22:00,720 Speaker 1: intelligent highway systems whenever we would design and implement those, 332 00:22:01,560 --> 00:22:04,840 Speaker 1: and all the different vehicles, or else you have cars 333 00:22:04,880 --> 00:22:08,000 Speaker 1: that won't know how to talk to highways, or cars 334 00:22:08,000 --> 00:22:09,640 Speaker 1: that don't know how to talk to each other, and 335 00:22:09,720 --> 00:22:13,800 Speaker 1: so on, and you just end up again with individual 336 00:22:13,840 --> 00:22:19,720 Speaker 1: smart vehicles and the same effect of an overall dumb system. Also, 337 00:22:20,240 --> 00:22:23,360 Speaker 1: we have to consider some of the massive challenges that 338 00:22:23,400 --> 00:22:27,840 Speaker 1: come along with building and implementing an automated highway system. So, 339 00:22:27,920 --> 00:22:32,760 Speaker 1: first of all, doing so would require massive investment in infrastructure.
340 00:22:33,400 --> 00:22:36,080 Speaker 1: That is always a tough thing to get going in 341 00:22:36,119 --> 00:22:38,840 Speaker 1: the United States for multiple reasons. Like we did see 342 00:22:38,960 --> 00:22:44,040 Speaker 1: a national infrastructure bill this year, but it did not 343 00:22:44,119 --> 00:22:46,919 Speaker 1: go easily, and it turns out like folks are not 344 00:22:47,000 --> 00:22:50,320 Speaker 1: super jazzed about having to pay more in taxes, which 345 00:22:50,400 --> 00:22:54,720 Speaker 1: is kind of understandable. Um, and without massive investment, it 346 00:22:54,760 --> 00:22:58,480 Speaker 1: would not be possible to build out a smart infrastructure, 347 00:22:59,200 --> 00:23:02,840 Speaker 1: even if we had had one that was designed and 348 00:23:02,880 --> 00:23:05,760 Speaker 1: ready to go and everyone had agreed that this is 349 00:23:05,800 --> 00:23:08,040 Speaker 1: the right approach. Even if that were the case, and 350 00:23:08,119 --> 00:23:11,320 Speaker 1: it's not, it would still be really hard to sell 351 00:23:11,440 --> 00:23:14,480 Speaker 1: that to the American public and say this is worth 352 00:23:14,520 --> 00:23:17,000 Speaker 1: paying for. So that's one hurdle, just getting the money 353 00:23:17,080 --> 00:23:22,160 Speaker 1: to pay for the rollout of such a system, let 354 00:23:22,200 --> 00:23:26,240 Speaker 1: alone the continued maintenance and operation of that system. Also, 355 00:23:26,280 --> 00:23:28,679 Speaker 1: building out that kind of a system would take a 356 00:23:28,880 --> 00:23:31,760 Speaker 1: huge amount of time. You get into a lot of 357 00:23:31,800 --> 00:23:34,119 Speaker 1: real world issues with this one.
I mean, like, 358 00:23:34,200 --> 00:23:38,560 Speaker 1: finding the right contractors to do the work is 359 00:23:38,640 --> 00:23:40,560 Speaker 1: kind of a boring thing to talk about, but it's 360 00:23:40,760 --> 00:23:44,000 Speaker 1: realistically one of the big challenges. I mean, hopefully the 361 00:23:44,400 --> 00:23:47,120 Speaker 1: government ends up finding contractors who don't end up having 362 00:23:47,160 --> 00:23:49,960 Speaker 1: suspicious links to the people who hold the purse strings, 363 00:23:50,520 --> 00:23:53,600 Speaker 1: because that happens all the time and it's never good. 364 00:23:54,280 --> 00:23:57,679 Speaker 1: Um, you also have to deal with unexpected problems 365 00:23:57,760 --> 00:24:01,359 Speaker 1: as you're deploying systems. This happens in any system, doesn't 366 00:24:01,359 --> 00:24:05,359 Speaker 1: matter how big or how complicated it is. It will happen, 367 00:24:05,359 --> 00:24:09,760 Speaker 1: and with something this huge, it will happen a lot. Uh. Also, 368 00:24:09,880 --> 00:24:11,720 Speaker 1: just building out the system so it reaches all the 369 00:24:11,760 --> 00:24:15,560 Speaker 1: areas necessary. I mean, the broadband initiatives in this country 370 00:24:15,680 --> 00:24:21,960 Speaker 1: show that it's very easy to underserve or even not serve 371 00:24:22,080 --> 00:24:26,879 Speaker 1: certain regions at all. That would be a huge disadvantage 372 00:24:26,880 --> 00:24:29,080 Speaker 1: to people who live in or work in those regions. So 373 00:24:30,200 --> 00:24:34,200 Speaker 1: this is a ton of work, and it's, again, 374 00:24:34,240 --> 00:24:37,239 Speaker 1: not impossible, but it is very hard, and the 375 00:24:37,280 --> 00:24:40,280 Speaker 1: harder it is, the less likely we're going to see 376 00:24:40,280 --> 00:24:44,200 Speaker 1: it happen. The actual technical requirements would be pretty darn 377 00:24:44,320 --> 00:24:48,560 Speaker 1: daunting too.
You would need a system capable of detecting, controlling, 378 00:24:48,880 --> 00:24:52,879 Speaker 1: and coordinating every vehicle on the road, and the system 379 00:24:52,920 --> 00:24:58,120 Speaker 1: would have to have an exceptionally high reliability factor because, obviously, 380 00:24:58,200 --> 00:25:01,199 Speaker 1: mistakes could lead to accidents, which could mean loss of 381 00:25:01,240 --> 00:25:06,040 Speaker 1: property and obviously much much worse. In some ways, the 382 00:25:06,160 --> 00:25:10,520 Speaker 1: system could be simpler than what you see with autonomous vehicles, 383 00:25:10,560 --> 00:25:13,040 Speaker 1: at least as long as you know conditions are normal. 384 00:25:13,520 --> 00:25:16,520 Speaker 1: But then we all know that driving conditions are paradoxically 385 00:25:16,880 --> 00:25:20,119 Speaker 1: rarely normal. All it takes is an animal running across 386 00:25:20,160 --> 00:25:24,240 Speaker 1: the lane of traffic or a pop-up heavy thunderstorm to 387 00:25:24,280 --> 00:25:29,120 Speaker 1: turn a normal drive into a dangerous one, and an automated 388 00:25:29,160 --> 00:25:31,239 Speaker 1: highway system would need to be able to detect and 389 00:25:31,280 --> 00:25:33,720 Speaker 1: cope with these sorts of things in a safe and 390 00:25:33,800 --> 00:25:39,040 Speaker 1: reliable way that wouldn't affect the overall system operations. Right, So, 391 00:25:39,119 --> 00:25:41,919 Speaker 1: let's say you've got a big old pop-up thunderstorm 392 00:25:41,960 --> 00:25:44,800 Speaker 1: along one stretch of highway.
Well, you would have to 393 00:25:44,800 --> 00:25:49,479 Speaker 1: have the highway deal with this all down the pathway 394 00:25:49,520 --> 00:25:53,040 Speaker 1: of the road so that you didn't have congestion and 395 00:25:53,080 --> 00:25:56,560 Speaker 1: traffic jams, because while the vehicles moving through the very 396 00:25:56,560 --> 00:25:59,560 Speaker 1: heavy storm might have to reduce their speed, that means, well, 397 00:26:00,040 --> 00:26:02,600 Speaker 1: three miles back where there is no rain, you're gonna 398 00:26:02,600 --> 00:26:04,000 Speaker 1: have to deal with that too. You're gonna have to 399 00:26:04,000 --> 00:26:07,840 Speaker 1: reduce speed there or else you're going to start encountering 400 00:26:07,840 --> 00:26:11,400 Speaker 1: traffic congestion, so like, these are really complicated systems, right? 401 00:26:12,920 --> 00:26:16,560 Speaker 1: Then there are privacy concerns. So presumably an automated highway 402 00:26:16,560 --> 00:26:20,479 Speaker 1: system would have to differentiate among all the different vehicles 403 00:26:20,520 --> 00:26:23,760 Speaker 1: traveling on the lanes. Right. If you didn't differentiate all 404 00:26:23,760 --> 00:26:26,000 Speaker 1: the vehicles, that would be disastrous because you wouldn't know 405 00:26:26,080 --> 00:26:29,000 Speaker 1: that vehicle one and vehicle two are in fact two 406 00:26:29,080 --> 00:26:34,359 Speaker 1: different vehicles. So maybe the system would assign a unique 407 00:26:34,400 --> 00:26:38,520 Speaker 1: identifier to each vehicle as it entered the highway. Maybe 408 00:26:38,600 --> 00:26:43,480 Speaker 1: manufacturers would include a unique identifier within a car's own systems, 409 00:26:43,520 --> 00:26:46,560 Speaker 1: and the highway would detect that.
But either way, the 410 00:26:46,640 --> 00:26:50,720 Speaker 1: highway will, quote unquote, know where each car is going, 411 00:26:51,040 --> 00:26:52,920 Speaker 1: when it got on the highway, when it got off 412 00:26:52,920 --> 00:26:56,520 Speaker 1: the highway. That means that each and every driver on 413 00:26:56,560 --> 00:26:59,879 Speaker 1: that highway, at least every driver who has a compatible system, 414 00:27:00,200 --> 00:27:03,119 Speaker 1: more on that in a second, will be tracked as 415 00:27:03,200 --> 00:27:05,600 Speaker 1: long as they are on that highway. That is a 416 00:27:05,640 --> 00:27:09,760 Speaker 1: major privacy concern, so that's another issue that has to 417 00:27:09,760 --> 00:27:13,000 Speaker 1: be solved. Then that brings us to the other big hurdle, 418 00:27:13,280 --> 00:27:18,359 Speaker 1: which is adoption. By that I mean cars won't magically 419 00:27:18,640 --> 00:27:22,360 Speaker 1: be able to communicate with an automated highway system. Let's 420 00:27:22,440 --> 00:27:26,760 Speaker 1: imagine that, you know, we've designed and built out 421 00:27:26,960 --> 00:27:33,720 Speaker 1: an incredible intelligent highway system, uh, down this one particular highway. Well, 422 00:27:34,440 --> 00:27:38,000 Speaker 1: you have to have cars that can interact with that system, right? 423 00:27:38,040 --> 00:27:41,199 Speaker 1: I mean, there's no magical connection there. So 424 00:27:41,240 --> 00:27:43,720 Speaker 1: the cars themselves have to have the required systems in order 425 00:27:43,720 --> 00:27:47,760 Speaker 1: to interoperate with the highways.
Now, if you had taken 426 00:27:47,760 --> 00:27:51,760 Speaker 1: a very collaborative approach along with like car manufacturing companies 427 00:27:52,280 --> 00:27:55,800 Speaker 1: and you worked with them, well, potentially those car manufacturers 428 00:27:56,080 --> 00:27:59,680 Speaker 1: might have started to include compatible systems in their vehicles 429 00:28:00,200 --> 00:28:03,480 Speaker 1: in the years leading up to the deployment of the 430 00:28:03,520 --> 00:28:06,960 Speaker 1: intelligent system on the highway. So in other words, maybe 431 00:28:06,960 --> 00:28:10,680 Speaker 1: you've got two or three years' worth of vehicle models 432 00:28:10,720 --> 00:28:13,879 Speaker 1: that already have those systems in place. They just, you know, 433 00:28:13,920 --> 00:28:17,000 Speaker 1: weren't connecting to anything until the highway was ready to go. 434 00:28:17,800 --> 00:28:20,200 Speaker 1: That would help a little bit, right? People who had 435 00:28:20,200 --> 00:28:25,000 Speaker 1: newer cars would be able to interact with this intelligent highway. 436 00:28:25,240 --> 00:28:27,119 Speaker 1: But it's not like you're going to convince every single 437 00:28:27,160 --> 00:28:30,160 Speaker 1: person to trade in their vehicle for a new one, 438 00:28:30,760 --> 00:28:33,800 Speaker 1: and that means your automated highway system will still have 439 00:28:33,880 --> 00:28:37,360 Speaker 1: human operated vehicles on it. It will only be communicating 440 00:28:37,400 --> 00:28:40,640 Speaker 1: with certain cars on the highway, and everybody else will 441 00:28:40,680 --> 00:28:45,440 Speaker 1: still be regular old humans driving on the roads. 442 00:28:45,480 --> 00:28:51,239 Speaker 1: That adds in uncertainty, right? That's an uncontrolled variable in 443 00:28:51,280 --> 00:28:54,880 Speaker 1: the system.
So even if the system is working flawlessly, 444 00:28:55,240 --> 00:29:00,400 Speaker 1: these uncontrolled variables can cause problems. That's very dangerous and 445 00:29:00,480 --> 00:29:02,600 Speaker 1: there's not really an easy solution to that. In fact, 446 00:29:03,040 --> 00:29:06,040 Speaker 1: you know, even if you have a system that works, 447 00:29:06,480 --> 00:29:08,720 Speaker 1: what you're likely going to see is that only 448 00:29:08,760 --> 00:29:11,640 Speaker 1: the people who are buying the most expensive cars will 449 00:29:11,680 --> 00:29:14,560 Speaker 1: initially be able to interact with it, which will create 450 00:29:14,720 --> 00:29:18,320 Speaker 1: haves and have-nots on the highway. And that doesn't 451 00:29:18,320 --> 00:29:21,040 Speaker 1: sound like it's a really good system either. Now, one 452 00:29:21,040 --> 00:29:24,440 Speaker 1: way you could potentially get around this, it's not a 453 00:29:24,520 --> 00:29:28,040 Speaker 1: practical solution, but it's kind of like a pie 454 00:29:28,040 --> 00:29:31,440 Speaker 1: in the sky sort of solution, is to incorporate something 455 00:29:31,520 --> 00:29:34,320 Speaker 1: similar to one of the many variations we've seen on 456 00:29:34,400 --> 00:29:39,800 Speaker 1: the Hyperloop concept, namely the sled. So one of 457 00:29:39,880 --> 00:29:45,160 Speaker 1: Elon Musk's Hyperloop concepts involved driving your vehicle onto 458 00:29:45,200 --> 00:29:49,560 Speaker 1: a sled that in turn could navigate through the 459 00:29:49,600 --> 00:29:53,239 Speaker 1: Hyperloop system.
So with this concept, drivers would drive up 460 00:29:53,320 --> 00:29:57,240 Speaker 1: and park their car or other vehicle on a sled, 461 00:29:58,000 --> 00:30:01,239 Speaker 1: and the sled would then navigate onto a track of 462 00:30:01,320 --> 00:30:06,560 Speaker 1: some sort, like a highway. So the car remains parked, 463 00:30:06,760 --> 00:30:09,560 Speaker 1: but the sled zooms off to the destination, whereupon 464 00:30:09,600 --> 00:30:12,840 Speaker 1: the driver then pulls off of the sled in 465 00:30:12,880 --> 00:30:15,440 Speaker 1: their vehicle and they drive the last mile or whatever. 466 00:30:16,040 --> 00:30:19,680 Speaker 1: An automated highway system that used something similar to this 467 00:30:19,960 --> 00:30:22,280 Speaker 1: could get around the fact that not all the vehicles 468 00:30:22,280 --> 00:30:24,120 Speaker 1: on the road are compatible with the system. In fact, 469 00:30:24,200 --> 00:30:26,520 Speaker 1: none of the vehicles would need to be compatible. It's 470 00:30:26,560 --> 00:30:30,640 Speaker 1: the sleds that are compatible, not the vehicles. But in 471 00:30:30,720 --> 00:30:33,200 Speaker 1: order to do that, you would have to build out 472 00:30:33,640 --> 00:30:37,040 Speaker 1: these sled-based highway systems, like you'd essentially have to 473 00:30:37,080 --> 00:30:41,080 Speaker 1: recreate all the highways that exist or slowly convert them 474 00:30:41,120 --> 00:30:45,640 Speaker 1: over while also allowing for traditional traffic to pass through them. 475 00:30:45,680 --> 00:30:48,520 Speaker 1: That is why it doesn't sound like a practical solution 476 00:30:48,560 --> 00:30:51,000 Speaker 1: to me. It just sounds like it's too big of 477 00:30:51,040 --> 00:30:57,400 Speaker 1: a problem to tackle. Even with an idealized implementation, you 478 00:30:57,480 --> 00:31:01,360 Speaker 1: still have concerns.
The concept covers highways but not other 479 00:31:01,400 --> 00:31:05,160 Speaker 1: types of roads, which means whenever you're not on the highway, 480 00:31:05,200 --> 00:31:08,600 Speaker 1: the vehicle will presumably be under human control. If it's 481 00:31:08,680 --> 00:31:11,320 Speaker 1: an autonomous vehicle, well then the question is, well, why 482 00:31:11,360 --> 00:31:13,880 Speaker 1: do we need an automated highway? If all the vehicles 483 00:31:13,880 --> 00:31:17,800 Speaker 1: are autonomous anyway, then we don't need the automated highway. 484 00:31:18,280 --> 00:31:21,720 Speaker 1: If the vehicle isn't autonomous, well then you still have issues. 485 00:31:21,760 --> 00:31:25,480 Speaker 1: When you're talking about entrances and exits to the highway, 486 00:31:25,480 --> 00:31:29,320 Speaker 1: those could become bottlenecks and congestion points. Sure, traffic along 487 00:31:29,360 --> 00:31:32,600 Speaker 1: the highway itself is smooth, but getting onto the highway 488 00:31:32,680 --> 00:31:34,640 Speaker 1: or getting off the highway, that could be a very 489 00:31:34,680 --> 00:31:39,800 Speaker 1: different situation. So it could create a new kind of headache. 490 00:31:40,200 --> 00:31:42,000 Speaker 1: And there are a lot of other issues as well. 491 00:31:42,600 --> 00:31:45,800 Speaker 1: I think people are more in favor of car companies 492 00:31:46,240 --> 00:31:49,240 Speaker 1: doing all the R and D work and the implementation 493 00:31:49,440 --> 00:31:53,920 Speaker 1: of technologies to make cars smarter and safer. Then it's 494 00:31:53,960 --> 00:31:56,800 Speaker 1: just up to the individual customer to decide if they 495 00:31:56,840 --> 00:32:00,120 Speaker 1: want to pay for a vehicle that has those features, 496 00:32:00,760 --> 00:32:04,680 Speaker 1: instead of collectively supporting a national effort to transform the 497 00:32:04,760 --> 00:32:07,960 Speaker 1: actual highway system. 
Like when you leave it up to 498 00:32:08,040 --> 00:32:11,960 Speaker 1: the individual, it's more palatable to people in the United States, 499 00:32:11,960 --> 00:32:13,960 Speaker 1: because here in the US there's always been more of 500 00:32:14,000 --> 00:32:19,720 Speaker 1: an inclination toward supporting individual freedom over buying into a 501 00:32:19,800 --> 00:32:24,440 Speaker 1: larger program that, if it works, could benefit everyone. That's 502 00:32:24,480 --> 00:32:26,520 Speaker 1: not a judgment on my part. I mean, it kind 503 00:32:26,520 --> 00:32:28,840 Speaker 1: of is a little bit, because, you know, I do 504 00:32:29,040 --> 00:32:35,920 Speaker 1: tend toward the more, um, collaborative end of the spectrum. But we have 505 00:32:36,120 --> 00:32:39,880 Speaker 1: plenty of examples of government projects that failed somewhere along 506 00:32:39,880 --> 00:32:44,360 Speaker 1: the way, so I wouldn't say that the inclination toward 507 00:32:44,640 --> 00:32:48,160 Speaker 1: individual freedom is entirely selfish. I think part of it 508 00:32:48,200 --> 00:32:52,520 Speaker 1: is somewhat practical, simply because there are these examples of 509 00:32:52,560 --> 00:32:57,600 Speaker 1: big government programs that failed which show a massive sunk cost, 510 00:32:58,040 --> 00:33:01,760 Speaker 1: and it's possible that automated highways could be in 511 00:33:01,800 --> 00:33:05,760 Speaker 1: that category. So yeah, I think that's important to 512 00:33:05,880 --> 00:33:08,560 Speaker 1: point out. Now there has really been a shift more 513 00:33:08,640 --> 00:33:13,000 Speaker 1: towards the concept of intelligent highways, which is sort of 514 00:33:13,040 --> 00:33:14,920 Speaker 1: really just a horse of a different color. It's 515 00:33:14,960 --> 00:33:18,200 Speaker 1: the same animal, just with a different name.
The 516 00:33:18,600 --> 00:33:22,960 Speaker 1: idea of intelligent highways largely involves embedded technology along highways that 517 00:33:23,640 --> 00:33:27,120 Speaker 1: can communicate and interact on at least some level with 518 00:33:27,200 --> 00:33:30,960 Speaker 1: the vehicles that are traveling along the highways. In some 519 00:33:31,040 --> 00:33:33,920 Speaker 1: cases we're talking about things like a dedicated lane for 520 00:33:34,720 --> 00:33:38,680 Speaker 1: autonomous vehicles. But again, this still brings up the challenge 521 00:33:38,800 --> 00:33:43,360 Speaker 1: of having to create a standard that everyone agrees to 522 00:33:43,400 --> 00:33:49,160 Speaker 1: play by so that the technologies are compatible. Otherwise it 523 00:33:49,280 --> 00:33:52,120 Speaker 1: ends up working for just a tiny slice of the 524 00:33:52,160 --> 00:33:58,160 Speaker 1: population and represents a massive, uh, inefficient approach to trying 525 00:33:58,200 --> 00:34:03,360 Speaker 1: to solve a very difficult problem. So I find the 526 00:34:03,480 --> 00:34:08,120 Speaker 1: idea of the automated highway system really attractive from the 527 00:34:08,280 --> 00:34:10,920 Speaker 1: end result. But the problem is getting to the end 528 00:34:10,960 --> 00:34:15,200 Speaker 1: result requires so many different pieces falling into place that 529 00:34:15,560 --> 00:34:19,600 Speaker 1: it is, like I said, close to impossible.
Uh, you know, 530 00:34:19,800 --> 00:34:23,680 Speaker 1: if you had buy-in everywhere, then maybe. Like, I think, 531 00:34:23,760 --> 00:34:25,799 Speaker 1: if we lived in a very different world, which would 532 00:34:25,800 --> 00:34:27,799 Speaker 1: not necessarily be a world I'd want to live in, 533 00:34:28,160 --> 00:34:29,959 Speaker 1: but if we lived in a very different world where 534 00:34:30,000 --> 00:34:34,880 Speaker 1: you had state-run everything, like, essentially you're talking 535 00:34:34,880 --> 00:34:38,960 Speaker 1: about a truly socialist or maybe even communist kind 536 00:34:38,960 --> 00:34:42,560 Speaker 1: of society, then you could have a system where everything 537 00:34:42,600 --> 00:34:48,839 Speaker 1: works together. But we've seen with pure socialist and communist societies 538 00:34:48,840 --> 00:34:52,920 Speaker 1: that, you know, when you throw people into a system, 539 00:34:53,040 --> 00:34:59,000 Speaker 1: it doesn't work as perfectly as the concept necessarily made 540 00:34:59,040 --> 00:35:01,600 Speaker 1: it out to be, so I don't think it's practical. 541 00:35:02,400 --> 00:35:07,440 Speaker 1: I wish that it were, because I definitely see issues 542 00:35:07,520 --> 00:35:10,040 Speaker 1: with the way we are going. We're relying on the 543 00:35:10,200 --> 00:35:14,680 Speaker 1: smart vehicle and the dumb infrastructure. I see problems with 544 00:35:14,719 --> 00:35:17,480 Speaker 1: that as well. It's not like all the problems disappear. 545 00:35:17,560 --> 00:35:21,480 Speaker 1: It's just that they become a little more manageable, a 546 00:35:21,520 --> 00:35:24,520 Speaker 1: little more contained than we see when we try to 547 00:35:24,560 --> 00:35:28,400 Speaker 1: address the overall system. So there's no perfect approach I 548 00:35:28,400 --> 00:35:31,839 Speaker 1: don't think.
I do think that, while I would love 549 00:35:31,880 --> 00:35:35,839 Speaker 1: to see a more integrated system that allows for rich 550 00:35:35,920 --> 00:35:40,279 Speaker 1: communication between the infrastructure and vehicles, because I think that 551 00:35:40,280 --> 00:35:44,200 Speaker 1: that would greatly boost effectiveness, I don't see it as 552 00:35:44,239 --> 00:35:47,440 Speaker 1: being something that we can realistically expect any time in 553 00:35:47,480 --> 00:35:52,440 Speaker 1: the foreseeable future. Still, maybe that vision from way back 554 00:35:52,480 --> 00:35:56,040 Speaker 1: in ninety nine, maybe it's still on the horizon, just 555 00:35:56,120 --> 00:35:59,760 Speaker 1: like the pavilion was called. Maybe that horizon is still there. 556 00:36:00,320 --> 00:36:03,040 Speaker 1: Maybe we're still headed toward it. It's just that our 557 00:36:03,200 --> 00:36:05,960 Speaker 1: road is a little more long and winding than we 558 00:36:06,080 --> 00:36:10,560 Speaker 1: first anticipated. I hope you enjoyed this episode. If you 559 00:36:10,600 --> 00:36:13,000 Speaker 1: have suggestions for future topics, please reach out to me. 560 00:36:13,080 --> 00:36:15,640 Speaker 1: You can do that by downloading the iHeartRadio app, 561 00:36:16,080 --> 00:36:19,440 Speaker 1: navigating over to the tech Stuff podcast section, click on 562 00:36:19,480 --> 00:36:22,000 Speaker 1: that little microphone icon. You can leave me a voice 563 00:36:22,040 --> 00:36:25,080 Speaker 1: message up to thirty seconds in length, or you can 564 00:36:25,080 --> 00:36:27,200 Speaker 1: reach out to me on Twitter. The handle for the 565 00:36:27,200 --> 00:36:30,160 Speaker 1: show is TechStuffHSW. I look forward 566 00:36:30,200 --> 00:36:32,360 Speaker 1: to hearing from you, and I'll talk to you again 567 00:36:33,080 --> 00:36:42,200 Speaker 1: really soon. Tech Stuff is an iHeartRadio production.
568 00:36:42,440 --> 00:36:45,239 Speaker 1: For more podcasts from iHeartRadio, visit the 569 00:36:45,360 --> 00:36:48,600 Speaker 1: iHeartRadio app, Apple Podcasts, or wherever you listen to 570 00:36:48,640 --> 00:36:49,560 Speaker 1: your favorite shows.